Neural network training

How do we train a neural network? Basically, we will provide the network with a set of input data as well as the results we expect to see, which correspond to those inputs. That data is then run through the network until the network understands what we are looking for. We will train, test, train, test, train, test, on and on until our network understands our data (or doesn't, but that's a whole other conversation). We continue to do this until some designated stop condition is satisfied, such as an error rate threshold. Let's quickly cover some of the terminology we will use while training neural networks.
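To make that train/test loop a little more concrete, here is a minimal sketch of what a stop-condition-driven training loop could look like. The INetwork interface and every member name in it are placeholders we made up for illustration; they are not the classes we will build later in the chapter.

public interface INetwork
{
    // Forward pass plus back propagation for one sample.
    void Train(double[] input, double[] expected);

    // Some measure of how far off the network still is, such as mean squared error.
    double ComputeError(double[] input, double[] expected);
}

public static class Trainer
{
    public static void TrainUntilConverged(INetwork network,
                                           double[][] inputs,
                                           double[][] expected,
                                           double errorThreshold,
                                           int maxEpochs)
    {
        for (int epoch = 0; epoch < maxEpochs; epoch++)
        {
            double totalError = 0.0;
            for (int i = 0; i < inputs.Length; i++)
            {
                network.Train(inputs[i], expected[i]);
                totalError += network.ComputeError(inputs[i], expected[i]);
            }

            // Stop once the average error falls below our designated threshold.
            if (totalError / inputs.Length < errorThreshold)
                break;
        }
    }
}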

Back propagation: After our data is run through the network, we validate the output the network produced against what we expect to be the correct output. We do this by propagating backward (hence backprop or back propagation) through each of the Hidden Layers of our network. The end result is that the weights assigned to each neuron's inputs in the Hidden Layers are adjusted, as is our error rate.

Each back propagation pass should, in a perfect world, bring our network's output closer to what we are expecting, and our error rate will get closer and closer to 0. We may never get to an exact error rate of 0, but even though it may not seem like much of a difference, an error rate of 0.0000001 could be more than acceptable to us.
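If you want a rough feel for the weight adjustment that happens during one of those passes, here is a simplified, single-output-neuron sketch. The method and variable names are ours for illustration only; the real pass walks backward through every Hidden Layer rather than a single neuron.

using System;

public static class BackPropSketch
{
    static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

    // Adjusts the weights feeding one output neuron for one training sample.
    public static void AdjustWeights(double[] inputs, double[] weights,
                                     double bias, double expected,
                                     double learningRate)
    {
        // Forward pass: weighted sum plus bias, squashed by the Sigmoid.
        double sum = bias;
        for (int i = 0; i < inputs.Length; i++)
            sum += inputs[i] * weights[i];
        double output = Sigmoid(sum);

        // The Sigmoid derivative is output * (1 - output), so the local
        // gradient (delta) is the error scaled by that derivative.
        double error = expected - output;
        double delta = error * output * (1.0 - output);

        // Nudge each weight in the direction that reduces the error.
        for (int i = 0; i < inputs.Length; i++)
            weights[i] += learningRate * delta * inputs[i];
    }
}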

Biases: Biases allow us to modify our activation function so that we can generate better output for each neuron in our network. In short, a bias allows us to shift the activation function to the left or the right, whereas changing the weight changes the steepness of the Sigmoid curve.
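Here is a quick way to see that shift for yourself. This tiny program is our own illustration, not code from the book's solution: it prints the Sigmoid of a weighted input with and without a bias of 2.0. The bias moves the point where the output crosses 0.5, while the weight controls how sharply the curve rises.

using System;

public static class BiasDemo
{
    static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

    public static void Main()
    {
        double weight = 2.0;

        // With no bias the Sigmoid crosses 0.5 at x = 0; a bias of 2.0
        // shifts the whole curve left so it crosses 0.5 at x = -1.
        foreach (double x in new[] { -2.0, -1.0, 0.0, 1.0, 2.0 })
        {
            double noBias = Sigmoid(weight * x);
            double withBias = Sigmoid(weight * x + 2.0);
            Console.WriteLine($"x={x,4:0.0}  no bias={noBias:0.000}  with bias={withBias:0.000}");
        }
    }
}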

Momentum: Momentum simply adds a fraction of the previous weight update to the current one. Momentum is used to prevent the system from converging on a local minimum rather than the global minimum. High momentum can be used to help increase the speed of convergence of the system; however, you must be careful as setting this parameter too high can create a risk of overshooting the minimum, which will result in an unstable system. On the other hand, a momentum that is too low cannot reliably avoid local minima, and it can also really slow down the training of the system. So, getting this value correct is paramount for success and something you will spend a considerable amount of time doing.

Sigmoid function: An activation function defines what each neuron's output will be. A Sigmoid function is perhaps the most commonly used activation function. It converts its input into a value that lies between 0 and 1. This function is used when we generate our initial weights. A typical Sigmoid implementation accepts an input value and, from that value, provides both an output value and its derivative.
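A helper along those lines might look like the following sketch; the class and method names here are placeholders rather than the ones used in the downloadable solution.

using System;

public static class SigmoidFunction
{
    // Squashes any input into the range (0, 1).
    public static double GetValue(double x)
    {
        return 1.0 / (1.0 + Math.Exp(-x));
    }

    // Derivative of the Sigmoid, expressed via its own output; back
    // propagation uses this to scale the error for each neuron.
    public static double GetDerivative(double x)
    {
        double value = GetValue(x);
        return value * (1.0 - value);
    }
}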

Learning rate: The learning rate will change the overall learning speed of the system by controlling the size of the weight and bias changes made to the network during the learning phase.
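Putting the momentum and learning rate definitions together, the weight update for a single connection might look something like the sketch below. The field and parameter names are illustrative and not taken from the book's source; delta stands for the error gradient computed during back propagation.

public class SynapseUpdate
{
    public double Weight;
    public double PreviousWeightDelta;

    public void Update(double delta, double input,
                       double learningRate, double momentum)
    {
        // The learning rate scales the size of this step; momentum re-applies
        // a fraction of the previous step to help roll past local minima.
        double weightDelta = learningRate * delta * input
                             + momentum * PreviousWeightDelta;

        Weight += weightDelta;
        PreviousWeightDelta = weightDelta;
    }
}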

Now that we have this terminology behind us, let's start digging into the code. You should have downloaded the solution from the software accompanying this book and opened it in Visual Studio. We use the Community Edition of Visual Studio, but you may use whichever version you have.

Feel free to download the software, experiment with it, and embellish it if you need or want to. In your world, your neural network can be anything you like or need it to be, so make it happen. You have the source. Just because you see something one way doesn't make it gospel or written in stone! Learn from what these great open source contributors have provided for us! Remember, this neural network is meant only to give you some idea of the many things that you could do writing your own, as well as teach you some of the basics when it comes to a neural network.

Let's start off by looking at some brief code snippets that will set the stage for the rest of the chapter. First, we'll look at a little thing called a synapse, which connects one neuron to another. Next, we'll code exactly what an individual neuron is, and finally we'll move on to forward and backward propagation and what they mean to us. We'll show everything in the form of code snippets to make it easier to understand.
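Before we get to the real snippets, here is the rough shape of those first two pieces, just to set expectations. These skeletons are deliberately stripped down and are not the actual classes from the accompanying solution.

using System.Collections.Generic;

// A Synapse connects two Neurons and carries a weight; a Neuron collects its
// incoming Synapses, sums the weighted inputs plus its bias, and applies an
// activation function to produce its value.
public class Synapse
{
    public Neuron InputNeuron;
    public Neuron OutputNeuron;
    public double Weight;
}

public class Neuron
{
    public List<Synapse> InputSynapses = new List<Synapse>();
    public double Bias;
    public double Value;
}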
