Batch normalization

Batch normalization rescales the input and the outputs of the intermediate layers of the network, making training smoother and faster. Within each mini-batch, the values are shifted and scaled so that every feature has a mean of 0 and a standard deviation of 1. This helps the neural network train faster and also provides a mild regularizing effect.
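
To make the per-batch rescaling concrete, here is a minimal NumPy sketch (not code from this book) of what a batch-normalization layer computes at training time; the names batch_norm_train, gamma, beta, and eps are assumptions introduced for illustration, with gamma and beta standing in for the layer's learnable scale and shift parameters.

    import numpy as np

    def batch_norm_train(x, gamma, beta, eps=1e-5):
        # x has shape (batch, features); statistics are taken over the batch axis.
        mean = x.mean(axis=0)                    # per-feature mean of the mini-batch
        var = x.var(axis=0)                      # per-feature variance of the mini-batch
        x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit standard deviation
        return gamma * x_hat + beta              # learnable rescaling of the normalized values

    # Example: a mini-batch of 32 samples with 4 features, deliberately off-center.
    x = np.random.randn(32, 4) * 5.0 + 3.0
    out = batch_norm_train(x, gamma=np.ones(4), beta=np.zeros(4))
    print(out.mean(axis=0), out.std(axis=0))     # approximately 0 and 1 per feature

With gamma initialized to ones and beta to zeros, the output starts out exactly normalized; during training the network can learn other values if a different scale or shift works better.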
