Calculation of parameters

The summary of the RNN model is as follows:

# Model summary
model

OUTPUT

Model
________________________________________________________________________
Layer (type) Output Shape Param #
========================================================================
embedding_21 (Embedding) (None, None, 32) 16000
________________________________________________________________________
simple_rnn_23 (SimpleRNN) (None, 8) 328
________________________________________________________________________
dense_24 (Dense) (None, 1) 9
========================================================================
Total params: 16,337
Trainable params: 16,337
Non-trainable params: 0
________________________________________________________________________

The number of parameters for the embedding layer is obtained by multiplying 500 (the number of most frequent words kept, i.e., the vocabulary size) by 32 (the embedding output dimension), giving 16,000. For the simple RNN layer, the parameter count is given by (h(h + i) + h), where h is the number of hidden units and i is the input dimension to this layer, which here is 32 (the embedding output dimension).

Thus, we have (8(8 + 32)+8) = 328 parameters.
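As a quick sanity check, the two counts can be reproduced with plain arithmetic. This is a minimal sketch using the values stated above (vocabulary size 500, embedding dimension 32, 8 hidden units); the variable names are illustrative, not from the original code.

```python
vocab_size = 500   # number of most frequent words kept
embed_dim = 32     # embedding output dimension
hidden = 8         # SimpleRNN hidden units

# Embedding: one 32-dimensional vector per vocabulary entry
embedding_params = vocab_size * embed_dim            # 500 * 32 = 16,000

# SimpleRNN: input kernel (h * i) + recurrent kernel (h * h) + bias (h)
rnn_params = hidden * (hidden + embed_dim) + hidden  # 8 * (8 + 32) + 8 = 328

print(embedding_params, rnn_params)  # 16000 328
```

Both numbers match the Param # column in the model summary.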

Note that a fully connected dense layer with the same input and output sizes would have only (8 x 32 + 8) = 264 parameters. The additional 64 parameters (8 x 8) come from the recurrent weight matrix, which feeds the hidden state from the previous time step back into the layer, allowing it to capture sequences in the text data.

In recurrent layers, information from the previous time step is also used, which is where these extra parameters come from. This is why RNNs are better suited to handling sequence data than a regular densely connected neural network layer. For the last layer, which is a dense layer, we have (1 x 8 + 1) = 9 parameters. Overall, this architecture has 16,337 parameters.
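Putting the three layers together reproduces the total reported in the summary. This sketch simply repeats the per-layer formulas from the text; the names are illustrative.

```python
embedding_params = 500 * 32        # Embedding: vocab_size * output_dim
rnn_params = 8 * (8 + 32) + 8      # SimpleRNN: h * (h + i) + h
dense_params = 1 * 8 + 1           # Dense: units * inputs + bias

total = embedding_params + rnn_params + dense_params
print(total)  # 16337, matching "Total params" in the summary
```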

In short, by using information from previous inputs, recurrent layers provide a better representation of the sequences present in text or similar sequential data.