Hyperparameter tuning and feature selection

The flexibility of neural networks is also one of their main drawbacks: there are many hyperparameters to tweak. Even in a simple MLP, you can change the number of layers, the number of neurons per layer, the activation function used in each layer, the number of training epochs, the learning rate, the weight initialization logic, the dropout keep probability, and so on. How do you know what combination of hyperparameters is best for your task?
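To make these knobs concrete, here is a minimal sketch, assuming TensorFlow/Keras is available; the `build_model` function and its parameter names are illustrative, not from the text. It builds an MLP classifier whose main hyperparameters are exposed as arguments:

```python
import tensorflow as tf

def build_model(n_hidden=2, n_neurons=64, activation="relu",
                learning_rate=1e-3, dropout_rate=0.2,
                input_shape=(28 * 28,), n_classes=10):
    """Build an MLP whose main hyperparameters are function arguments."""
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Input(shape=input_shape))
    for _ in range(n_hidden):                        # number of hidden layers
        model.add(tf.keras.layers.Dense(n_neurons,   # neurons per layer
                                        activation=activation))
        # Note: Keras' Dropout takes the *drop* rate, i.e. 1 - keep probability.
        model.add(tf.keras.layers.Dropout(dropout_rate))
    model.add(tf.keras.layers.Dense(n_classes, activation="softmax"))
    model.compile(
        loss="sparse_categorical_crossentropy",
        optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rate),
        metrics=["accuracy"])
    return model
```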

Of course, you can use grid search with cross-validation to find the right hyperparameters, just as you would for simpler machine learning models, but a deep learning model's hyperparameter space is vastly larger. And since training a neural network on a large dataset takes a lot of time, you will only be able to explore a tiny part of that space in a reasonable amount of time. Here are some useful insights.
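One common answer is randomized search: sample hyperparameter combinations at random, train briefly on each, and keep the one with the best validation score. The sketch below is an assumption on my part, not the text's method; it reuses the hypothetical `build_model` function above and Fashion-MNIST as a stand-in dataset, with a deliberately tiny trial budget:

```python
import numpy as np
import tensorflow as tf

# Stand-in dataset; any train/validation split works the same way.
(X_full, y_full), _ = tf.keras.datasets.fashion_mnist.load_data()
X_full = X_full.reshape(-1, 28 * 28).astype("float32") / 255.0
X_train, X_valid = X_full[:-5000], X_full[-5000:]
y_train, y_valid = y_full[:-5000], y_full[-5000:]

# Candidate values for each hyperparameter (illustrative choices).
param_space = {
    "n_hidden": [1, 2, 3],
    "n_neurons": [32, 64, 128, 256],
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "dropout_rate": [0.0, 0.2, 0.5],
}

rng = np.random.default_rng(42)
best_score, best_params = -np.inf, None
for trial in range(10):  # tiny budget; real searches use many more trials
    # Sample one value per hyperparameter, uniformly at random.
    params = {name: rng.choice(values).item()
              for name, values in param_space.items()}
    model = build_model(**params)
    history = model.fit(X_train, y_train, epochs=5,
                        validation_data=(X_valid, y_valid), verbose=0)
    score = max(history.history["val_accuracy"])
    if score > best_score:
        best_score, best_params = score, params

print(f"Best validation accuracy: {best_score:.4f} with {best_params}")
```

For a fixed trial budget, randomized search typically explores a large space more effectively than grid search, since it does not waste entire grid dimensions on hyperparameters that turn out not to matter.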
