How it works...

The function begins by initializing new weights and biases. It then performs a matrix multiplication of the input layer with the initialized weights and adds the corresponding biases.

If the fully connected layer is not the final layer of the CNN TensorFlow graph, a ReLU non-linear activation is applied. Finally, the fully connected layer is returned.
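The steps above can be sketched as follows. This is a minimal illustration of the same logic using NumPy rather than TensorFlow ops, so the arithmetic is explicit; the function name, the initialization scheme (small random weights, constant biases), and the `use_relu` flag are assumptions for illustration, not the book's exact code.

```python
import numpy as np

def fully_connected(layer_input, num_outputs, use_relu=True, seed=0):
    """Sketch of a fully connected layer: initialize new weights and
    biases, matrix-multiply the input with the weights, add the biases,
    and optionally apply a ReLU activation."""
    rng = np.random.default_rng(seed)
    num_inputs = layer_input.shape[1]
    # Initialize new weights (small random values) and biases (small constant)
    weights = rng.normal(0.0, 0.05, size=(num_inputs, num_outputs))
    biases = np.full(num_outputs, 0.05)
    # Matrix multiplication of the input with the weights, plus the biases
    layer = layer_input @ weights + biases
    # Apply ReLU unless this is the final layer of the graph
    if use_relu:
        layer = np.maximum(layer, 0.0)
    return layer
```

For the final layer of the graph, you would call this with `use_relu=False` so the raw scores can be passed to a softmax or loss function.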
