Do not initialize all weights with zeros

We might now be inclined to think that setting all our weights to zero achieves maximum symmetry. However, this is actually a very bad idea, and a model initialized this way will never learn anything. On the forward pass, every neuron in a layer produces exactly the same output, so during backpropagation every weight in that layer receives exactly the same gradient and updates in the same way. The neurons remain interchangeable forever, so the model can never learn an informative set of features. Don't initialize like this.
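The symmetry problem is easy to see in a few lines of NumPy. The toy two-layer network below is a hypothetical sketch (not from the text): with all weights at zero, one forward/backward pass produces identical gradient columns for every hidden unit, so no update can ever break the symmetry.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))        # 4 samples, 3 features
y = rng.normal(size=(4, 1))        # regression targets

W1 = np.zeros((3, 5))              # hidden-layer weights, all zero
W2 = np.zeros((5, 1))              # output-layer weights, all zero

# Forward pass: every hidden unit computes the identical activation.
h = np.tanh(x @ W1)                # tanh(0) = 0 for every unit
out = h @ W2

# Backward pass (mean squared error).
grad_out = 2 * (out - y) / len(x)
grad_W2 = h.T @ grad_out                     # zero, since h is zero
grad_h = grad_out @ W2.T                     # zero, since W2 is zero
grad_W1 = x.T @ (grad_h * (1 - h ** 2))      # zero everywhere

# Every column of grad_W1 is identical, so all hidden units receive
# the same update and stay interchangeable no matter how long we train.
print(np.allclose(grad_W1, grad_W1[:, :1]))  # -> True
```

Breaking this tie is exactly why standard schemes (e.g. random Gaussian or Glorot/Xavier initialization) draw small *random* weights instead.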
