Dropout

Dropout is a regularization technique in which randomly selected neurons are ignored (dropped out) during training, which in turn regularizes the weights. In the accompanying diagram, a standard network appears on the left and the same network with dropout applied on the right.
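As a minimal sketch of the mechanism, the snippet below applies inverted dropout to a layer's activations: each unit is kept with probability keep_prob, and the surviving activations are rescaled so that their expected value is unchanged. The array shape and the keep probability are illustrative choices, not values taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(activations, keep_prob=0.8, training=True):
    # Inverted dropout: each unit is kept with probability keep_prob;
    # the kept activations are rescaled by 1/keep_prob so the expected
    # layer output is the same with and without dropout.
    if not training:
        # At inference time all units are kept and no scaling is needed.
        return activations
    mask = rng.binomial(1, keep_prob, size=activations.shape)
    return activations * mask / keep_prob

# Activations of a hidden layer for a batch of 4 examples with 6 units each.
hidden = rng.normal(size=(4, 6))
dropped = dropout_forward(hidden, keep_prob=0.8, training=True)
```

Because a different random mask is drawn on every training pass, no single unit can be relied upon to always be present, which is what produces the regularizing effect described next.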

In effect, dropout prevents the network from relying too heavily on any single node or feature, a reliance that can lead to overfitting. The weight values are instead spread across many different nodes, which regularizes the output; in practice, dropout is enabled only during training and switched off at inference time, as the short sketch following this paragraph shows. Another regularization technique, one that works on the data itself, is batch normalization, which is explained next.
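The sketch below assumes PyTorch purely for illustration (the framework is not specified in this excerpt); the layer sizes and the dropout probability are likewise arbitrary. It shows a dropout layer placed between two fully connected layers, and the switch between training mode, where random units are zeroed, and evaluation mode, where the full network is used.

```python
import torch
from torch import nn

# A small fully connected network with a dropout layer between the
# hidden layer and the output layer. Sizes and p=0.5 are illustrative.
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # zeroes each activation with probability 0.5 during training
    nn.Linear(64, 10),
)

x = torch.randn(8, 20)

model.train()            # dropout active: a fresh random mask on every forward pass
train_out = model(x)

model.eval()             # dropout disabled: all units contribute at inference time
with torch.no_grad():
    eval_out = model(x)
```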
