Performance optimization tips and best practices

To further improve image classification, in this section we will run three experiments. In the first, we will compile the model with the Adam optimizer. In the second, we will carry out hyperparameter tuning, varying the number of units in the dense layer, the dropout rate in the dropout layer, and the batch size when fitting the model. In the third, we will work with another pretrained network, VGG16.
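The three experiments can be sketched with a small Keras helper. This is a minimal illustration, not the book's exact code: the input shape, layer sizes, learning rate, and the binary classification head are assumptions chosen for the example. The helper exposes the hyperparameters varied in the second experiment (dense units and dropout rate; batch size is passed later to `fit`), compiles with Adam as in the first experiment, and a second function shows how a frozen pretrained VGG16 base would slot in for the third.

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_model(dense_units=256, dropout_rate=0.5):
    """Small classifier head; dense_units and dropout_rate are the
    hyperparameters tuned in the second experiment (values assumed)."""
    model = keras.Sequential([
        layers.Input(shape=(150, 150, 3)),  # assumed image size
        layers.Flatten(),
        layers.Dense(dense_units, activation="relu"),
        layers.Dropout(dropout_rate),
        layers.Dense(1, activation="sigmoid"),  # assumed binary task
    ])
    # Experiment 1: compile with the Adam optimizer.
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-4),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

def build_vgg16_model(dense_units=256, dropout_rate=0.5):
    """Experiment 3 sketch: a frozen, pretrained VGG16 convolutional base
    (ImageNet weights download on first use) under the same head."""
    conv_base = keras.applications.VGG16(weights="imagenet",
                                         include_top=False,
                                         input_shape=(150, 150, 3))
    conv_base.trainable = False  # freeze pretrained weights
    model = keras.Sequential([
        conv_base,
        layers.Flatten(),
        layers.Dense(dense_units, activation="relu"),
        layers.Dropout(dropout_rate),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-4),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

For the second experiment, different configurations would then be compared by calling, for example, `build_model(dense_units=128, dropout_rate=0.3)` and fitting each with a different `batch_size` argument.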
