For backpropagation (backprop), we first calculate the gradient at the output layer against each target value, propagate those gradients back through the hidden layers in reverse order (the opposite of the direction we took in forward propagation), update the hidden-layer weights, and finally update the output-layer weights, as follows:
private void BackPropagate(params double[] targets)
{
    var i = 0;

    // Calculate the gradient of each output neuron against its target value.
    OutputLayer?.ForEach(a => a.CalculateGradient(targets[i++]));

    // Walk the hidden layers in reverse order, propagating gradients back
    // and then updating each neuron's weights.
    HiddenLayers?.Reverse();
    HiddenLayers?.ForEach(a => a.ForEach(b => b.CalculateGradient()));
    HiddenLayers?.ForEach(a => a.ForEach(b => b.UpdateWeights(LearningRate, Momentum)));
    HiddenLayers?.Reverse(); // Restore the original layer order.

    // Finally, update the output-layer weights.
    OutputLayer?.ForEach(a => a.UpdateWeights(LearningRate, Momentum));
}
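The bodies of `CalculateGradient` and `UpdateWeights` are not shown here, but the `LearningRate` and `Momentum` parameters suggest the standard momentum update rule. A minimal Python sketch of that rule (the function name and signature are illustrative assumptions, not the code above) might look like this:

```python
def update_weight(weight, gradient, input_value, prev_delta,
                  learning_rate, momentum):
    """Standard momentum weight update (assumed, not taken from the listing).

    The new step combines the current gradient signal with a fraction of
    the previous step, which smooths the descent direction.
    """
    # delta = learning_rate * gradient * input + momentum * previous delta
    delta = learning_rate * gradient * input_value + momentum * prev_delta
    # Return the adjusted weight and the delta to reuse on the next pass.
    return weight + delta, delta

# Example: lr=0.1, momentum=0.9, no previous step yet.
w, d = update_weight(weight=0.5, gradient=0.1, input_value=1.0,
                     prev_delta=0.0, learning_rate=0.1, momentum=0.9)
# A second pass reuses the previous delta via the momentum term.
w2, d2 = update_weight(w, 0.1, 1.0, d, 0.1, 0.9)
```

On the first call the momentum term contributes nothing (there is no previous delta); on the second call it adds 90% of the prior step, which is what accelerates learning along consistent gradient directions.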