Backward cycle

The actual training of the network happens in the backward cycle, which is why the algorithm is called backpropagation. First, the error, the difference between the predicted value and the actual value, is calculated. This error is then propagated backward through the model, and the weights are adjusted. The derivative of the error with respect to each weight tells us how much that weight contributed to the overall error, and each weight is adjusted in proportion to its contribution. Refer to https://en.wikipedia.org/wiki/Backpropagation to learn more.
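To make this concrete, here is a minimal sketch of such a backward-cycle update for a single linear neuron with a squared-error loss, written in plain base R rather than the MXNet API; all names and values (x, y, w, b, lr) are illustrative assumptions, not code from this book:

set.seed(42)
x <- c(0.5, -1.2, 0.8)   # one input sample with three features
y <- 1.0                 # actual (target) value
w <- runif(3)            # weights, initialized randomly
b <- 0.0                 # bias
lr <- 0.1                # learning rate

for (step in 1:100) {
  y_hat <- sum(w * x) + b    # forward cycle: compute the prediction
  error <- y_hat - y         # difference between predicted and actual value
  # Backward cycle: the derivative of the squared error 0.5 * error^2
  # with respect to each weight is error * x; with respect to b it is error.
  grad_w <- error * x
  grad_b <- error
  # Adjust each weight in proportion to its contribution to the error.
  w <- w - lr * grad_w
  b <- b - lr * grad_b
}
cat("final error:", (sum(w * x) + b) - y, "\n")

Running this repeatedly drives the error toward zero, which is the same gradient-descent idea that libraries such as MXNet apply automatically across many layers and many weights.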

Hopefully, this gave you an intuitive explanation of how neural networks learn from data. Let us move on to the next section to learn more about the MXNet R library we will be using to build our deep learning network.
