Ensemble predictions using neural networks

Another approach to regularization involves combining several neural network models and averaging their results. The resulting ensemble is typically more accurate than any of its individual members.

A neural network ensemble is a set of neural network models that makes a decision by averaging the results of the individual models. Ensembling is a simple way to improve generalization, especially when poor generalization is caused by noisy data or a small dataset: we train multiple neural networks and average their outputs.
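Why averaging helps can be seen from the convexity of the squared error (a standard argument, not spelled out in this chapter). For a target $t(x)$ and member predictions $y_m(x)$, the ensemble output is the average

$$\bar{y}(x) = \frac{1}{M}\sum_{m=1}^{M} y_m(x)$$

and, by Jensen's inequality,

$$\big(\bar{y}(x) - t(x)\big)^2 \le \frac{1}{M}\sum_{m=1}^{M} \big(y_m(x) - t(x)\big)^2$$

so the mean squared error of the ensemble never exceeds the average of the members' mean squared errors.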

As an example, we train 20 neural networks on the same learning problem, adjusting various parameters during training, and then compare the mean squared error of each network with the mean squared error of their averaged predictions.

The following steps are followed (an end-to-end R sketch implementing them appears after the list):

  1. The dataset is loaded and divided into training and test sets. The percentage split can be varied across the different neural network models.
  2. Multiple models are created from the different training sets and by adjusting the parameters of the nnet() function.
  3. All the models are trained, and the error of each model is tabulated.
  4. The average of the model predictions is computed for each row of the test data, and the mean squared error is calculated for each model.
  5. Each model's mean squared error is compared with the mean squared error of the averaged prediction.
  6. The best model is chosen from this comparison and used for further prediction.
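Here is a minimal end-to-end sketch of these steps, assuming the Boston housing data from the MASS package as a stand-in dataset (the chapter's actual dataset may differ); for brevity, all models here share a single train/test split and differ only in the size parameter of nnet(), though the split percentage can also be varied per model as the steps describe:

```r
library(nnet)
library(MASS)

set.seed(1)
data <- Boston
# Scale all columns to [0, 1]; nnet() works best on scaled inputs
data <- as.data.frame(lapply(data, function(x) (x - min(x)) / (max(x) - min(x))))

# Step 1: split into training and test sets (a 70/30 split here)
idx   <- sample(nrow(data), round(0.7 * nrow(data)))
train <- data[idx, ]
test  <- data[-idx, ]

# Steps 2 and 3: train several models, varying the nnet() parameters
sizes  <- c(3, 5, 7, 9, 11)          # hidden units per model (assumed values)
models <- lapply(sizes, function(s)
  nnet(medv ~ ., data = train, size = s, linout = TRUE,
       decay = 0.01, maxit = 500, trace = FALSE))

# Step 4: each model's test predictions, plus their row-wise average
preds    <- sapply(models, predict, newdata = test)   # one column per model
avg_pred <- rowMeans(preds)

# Step 5: compare individual MSEs with the MSE of the averaged prediction
mse        <- function(y, yhat) mean((y - yhat)^2)
model_mses <- apply(preds, 2, mse, y = test$medv)
avg_mse    <- mse(test$medv, avg_pred)

print(round(model_mses, 5))
print(round(avg_mse, 5))

# Step 6: the averaged prediction (or the best single model)
# can now be used for further prediction
```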

This method lets us experiment with the data and the function parameters to arrive at an optimal model configuration. We can include any number of models in the ensemble and train them in parallel using R, as sketched below.
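Since each ensemble member is trained independently, the training loop parallelizes naturally. A minimal sketch using the base parallel package, reusing the train data frame and settings assumed in the sketch above:

```r
library(parallel)

sizes <- c(3, 5, 7, 9, 11)
cl    <- makeCluster(detectCores() - 1)   # one worker per spare core
clusterEvalQ(cl, library(nnet))           # load nnet on every worker
clusterExport(cl, "train")                # ship the training data to workers

# Train one model per worker, each with a different hidden-layer size
models <- parLapply(cl, sizes, function(s)
  nnet(medv ~ ., data = train, size = s, linout = TRUE,
       decay = 0.01, maxit = 500, trace = FALSE))

stopCluster(cl)
```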

This considerably reduces overfitting and helps us arrive at the best parameter settings for the model.
