Developing deep neural networks

Although we won't be developing a classification model based on just five tweets, let's look at the code for our model's architecture:

library(keras)

model <- keras_model_sequential()
model %>% layer_embedding(input_dim = 10,
                          output_dim = 8,
                          input_length = 5)
summary(model)

OUTPUT
__________________________________________________________________________________
Layer (type)                        Output Shape                     Param #
==================================================================================
embedding_1 (Embedding)             (None, 5, 8)                     80
==================================================================================
Total params: 80
Trainable params: 80
Non-trainable params: 0
__________________________________________________________________________________

print(model$get_weights(), digits = 2)
[[1]]
[,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8]
[1,] 0.0055 -0.0364 -0.0475 0.049 -0.0139 -0.0114 -0.0452 -0.0298
[2,] 0.0398 -0.0143 -0.0406 0.023 -0.0496 -0.0124 0.0087 -0.0104
[3,] 0.0370 -0.0321 -0.0491 -0.021 -0.0214 0.0391 0.0428 -0.0398
[4,] -0.0257 0.0294 0.0433 0.048 0.0259 -0.0323 -0.0308 0.0224
[5,] -0.0079 -0.0255 0.0164 0.023 -0.0486 0.0273 0.0245 -0.0020
[6,] 0.0372 0.0464 0.0454 -0.020 0.0086 -0.0375 -0.0188 0.0395
[7,] 0.0293 0.0305 0.0130 0.037 -0.0324 -0.0069 -0.0248 0.0178
[8,] -0.0116 -0.0087 -0.0344 0.027 0.0132 0.0430 -0.0196 -0.0356
[9,] 0.0314 -0.0315 0.0074 -0.044 -0.0198 -0.0135 -0.0353 0.0081
[10,] 0.0426 0.0199 -0.0306 -0.049 0.0259 -0.0341 -0.0155 0.0147

From the preceding code, we can observe the following:

  • We initialized the model using keras_model_sequential().
  • We specified the input dimension as 10, which is the size of the vocabulary (the number of most frequent words we kept).
  • The output dimension of 8 is the length of each embedding vector, which leads to the number of parameters being 10 x 8 = 80.
  • The input length of 5 is the length of each input sequence of integers.
  • We can get the weights for these 80 parameters using model$get_weights(). Note that, because the weights are randomly initialized, they will change every time the model is initialized (see the sketch after this list).
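To see what the embedding layer actually does with these weights, the following minimal sketch (not from the text; the index values are made up for illustration) passes one padded sequence of five word indices through the layer and checks that each output vector is simply a row of the weight matrix:

library(keras)

model <- keras_model_sequential()
model %>% layer_embedding(input_dim = 10,
                          output_dim = 8,
                          input_length = 5)

# One hypothetical padded sequence of 5 word indices; Keras indices are
# 0-based, so valid values here range from 0 to 9.
x <- matrix(c(2, 0, 6, 1, 8), nrow = 1)

# The prediction has shape (1, 5, 8): one 8-dimensional vector per index.
emb <- model %>% predict(x)
dim(emb)

# Each slice emb[1, i, ] is just row x[1, i] + 1 (R is 1-based) of the
# 10 x 8 weight matrix returned by model$get_weights()[[1]].
all.equal(emb[1, 1, ], model$get_weights()[[1]][x[1, 1] + 1, ])

In other words, the embedding layer is a trainable lookup table: each integer index selects one row of the 10 x 8 weight matrix, and those rows are what get updated during training.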