Regularization techniques

Overfitting is a common problem in ML, where a model blindly learns all of the patterns in its training data, including noise. A neural network can easily overfit during training because of its large number of trainable parameters. In theory, given input data of any size, a large enough Artificial Neural Network (ANN) can memorize all of the patterns in it, along with the noise. The model's weights therefore have to be regularized to avoid overfitting the data.

We will look at three types of regularization, followed by a brief code sketch that combines them:

  • Dropout
  • Batch normalization
  • L1 and L2 regularization
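
As a concrete illustration, here is a minimal sketch that applies all three techniques in a single model. The framework choice (TensorFlow/Keras) and every hyperparameter (layer sizes, dropout rate, penalty strengths) are illustrative assumptions, not values taken from this section:

```python
# A minimal sketch, assuming TensorFlow/Keras; all sizes and
# hyperparameters here are illustrative, not prescribed values.
import tensorflow as tf
from tensorflow.keras import layers, regularizers

model = tf.keras.Sequential([
    layers.Input(shape=(784,)),  # e.g. a flattened 28x28 image
    layers.Dense(
        256, activation="relu",
        # L1 and L2 penalties on the weights discourage large parameters
        kernel_regularizer=regularizers.l1_l2(l1=1e-5, l2=1e-4),
    ),
    layers.BatchNormalization(),  # normalizes activations over each mini-batch
    layers.Dropout(0.5),          # randomly zeroes 50% of units during training
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Note that dropout and batch normalization only change the model's behavior during training; at inference time, dropout is disabled and batch normalization uses its accumulated running statistics.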