How it works...

For step 1, if the neural network has only a single hidden layer, then the number of incoming connections (inputs) of that hidden layer must equal the number of outgoing connections of the preceding layer. If you have multiple hidden layers, you will need to confirm this for every pair of adjacent layers.

After you make sure that the number of input neurons matches the number of outgoing neurons in the preceding layer, you can create hidden layers using DenseLayer. In step 2, we used DenseLayer to create hidden layers that receive the data from the input layer. In practice, we need to evaluate the model multiple times to understand the network's performance; there is no single layer configuration that works well for all models. Also, ReLU is the preferred activation function for hidden layers, due to its nonlinear nature.
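As a reference, here is a minimal sketch of how such a stack of hidden layers might be configured with DL4J's MultiLayerConfiguration builder. The sizes (numInputs, hiddenUnits, numClasses) are placeholder values, not taken from the recipe, and the exact builder calls may vary slightly between DL4J versions:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

int numInputs = 784;    // placeholder: number of input features
int hiddenUnits = 128;  // placeholder: neurons per hidden layer
int numClasses = 10;    // placeholder: number of output labels

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .list()
    // First hidden layer: nIn must equal the input feature count
    .layer(new DenseLayer.Builder()
        .nIn(numInputs)
        .nOut(hiddenUnits)
        .activation(Activation.RELU)
        .build())
    // Second hidden layer: nIn must equal the previous layer's nOut
    .layer(new DenseLayer.Builder()
        .nIn(hiddenUnits)
        .nOut(hiddenUnits)
        .activation(Activation.RELU)
        .build())
    // Output layer: nIn again matches the preceding layer's nOut
    .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
        .nIn(hiddenUnits)
        .nOut(numClasses)
        .activation(Activation.SOFTMAX)
        .build())
    .build();
```

If the nIn of any layer does not match the nOut of the layer before it, DL4J will typically fail with a shape mismatch error during initialization or the first forward pass, which is why the check described in step 1 matters.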
