Chapter 9. Optimizing and Adapting Neural Networks

In this chapter, the reader will be presented with techniques that help to optimize neural networks so that they deliver their best performance. Tasks such as input selection, dataset separation and filtering, choosing the number of hidden neurons, and cross-validation strategies are examples of what can be adjusted to improve a neural network's performance. Furthermore, this chapter focuses on methods for adapting neural networks to real-time data. Two implementations of these techniques are presented here, and application problems are selected for the exercises. This chapter deals with the following topics:

  • Input selection
  • Dimensionality reduction
  • Data filtering
  • Structure selection
  • Pruning
  • Validation strategies
  • Cross-validation
  • Online retraining
  • Stochastic online learning
  • Adaptive neural networks
  • Adaptive resonance theory

Common issues in neural network implementations

When developing a neural network application, it is quite common to face problems regarding how accurate the results are. The source of these problems can be various:

  • Bad input selection
  • Noisy data
  • Too big a dataset
  • Unsuitable structure
  • Inadequate number of hidden neurons
  • Inadequate learning rate
  • Insufficient stop condition
  • Bad dataset segmentation
  • Bad validation strategy

The design of a neural network application sometimes requires a lot of patience and trial and error: there is no methodology that specifies exactly how many hidden units or which architecture should be used, although there are recommendations on how to choose these parameters properly. Another issue programmers may face is a long training time at the end of which the neural network still has not learned the data; no matter how long the training runs, the network fails to converge.

Tip

Designing a neural network requires the programmer or designer to test and redesign the neural structure as many times as needed, until an acceptable result is obtained.
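What such a trial-and-error search can look like in practice is illustrated by the following minimal, self-contained Java sketch. It is not the chapter's implementation: the class name HiddenNeuronSearch, the candidate sizes, and the XOR stand-in dataset are illustrative assumptions. The sketch trains a small one-hidden-layer sigmoid network with backpropagation for each candidate number of hidden neurons and keeps the size that reaches the lowest mean squared error.

import java.util.Random;

// A minimal sketch (not the chapter's code): several candidate hidden-layer
// sizes are trained and the one with the lowest error is kept.
public class HiddenNeuronSearch {

    // The XOR dataset is used here purely as a stand-in for a real training set
    static final double[][] INPUTS  = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
    static final double[]   TARGETS = {0, 1, 1, 0};

    public static void main(String[] args) {
        int[] candidates = {2, 3, 5, 8};   // hidden-neuron counts to try
        int bestSize = -1;
        double bestMse = Double.MAX_VALUE;
        for (int hidden : candidates) {
            double mse = trainAndEvaluate(hidden, 5000, 0.5);
            System.out.printf("hidden=%d  mse=%.6f%n", hidden, mse);
            if (mse < bestMse) {
                bestMse = mse;
                bestSize = hidden;
            }
        }
        System.out.println("Selected hidden layer size: " + bestSize);
    }

    // Trains a 2-hidden-1 sigmoid network with plain backpropagation and
    // returns the mean squared error after the last epoch.
    static double trainAndEvaluate(int hidden, int epochs, double learningRate) {
        Random rnd = new Random(42);
        double[][] wIn = new double[hidden][2];   // input-to-hidden weights
        double[] bHid = new double[hidden];
        double[] wOut = new double[hidden];       // hidden-to-output weights
        double bOut = 0;
        for (int h = 0; h < hidden; h++) {
            wIn[h][0] = rnd.nextDouble() - 0.5;
            wIn[h][1] = rnd.nextDouble() - 0.5;
            wOut[h]   = rnd.nextDouble() - 0.5;
        }
        double mse = 0;
        for (int epoch = 0; epoch < epochs; epoch++) {
            mse = 0;
            for (int p = 0; p < INPUTS.length; p++) {
                // forward pass
                double[] hOut = new double[hidden];
                double net = bOut;
                for (int h = 0; h < hidden; h++) {
                    hOut[h] = sigmoid(wIn[h][0] * INPUTS[p][0]
                            + wIn[h][1] * INPUTS[p][1] + bHid[h]);
                    net += wOut[h] * hOut[h];
                }
                double out = sigmoid(net);
                double error = TARGETS[p] - out;
                mse += error * error;
                // backward pass: delta rule with sigmoid derivatives
                double deltaOut = error * out * (1 - out);
                for (int h = 0; h < hidden; h++) {
                    double deltaHid = deltaOut * wOut[h] * hOut[h] * (1 - hOut[h]);
                    wOut[h]   += learningRate * deltaOut * hOut[h];
                    wIn[h][0] += learningRate * deltaHid * INPUTS[p][0];
                    wIn[h][1] += learningRate * deltaHid * INPUTS[p][1];
                    bHid[h]   += learningRate * deltaHid;
                }
                bOut += learningRate * deltaOut;
            }
            mse /= INPUTS.length;
        }
        return mse;
    }

    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }
}

Note that the sketch compares candidates by their training error only because the toy dataset has just four patterns; in a real application the comparison should be made on a separate validation set, a point the validation strategies covered later in this chapter address.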

On the other hand, the neural network solution designer may wish to improve the results. Since a neural network learns only until the learning algorithm reaches its stop condition, typically a maximum number of epochs or a target mean squared error, the results obtained at that point may still not be accurate enough or may not generalize well. This calls for a redesign of the neural structure or the selection of a new dataset.
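To make the role of the stop condition concrete, here is a second minimal Java sketch, again an illustrative assumption rather than the chapter's implementation (the class StopConditionDemo, the single sigmoid neuron, and the AND dataset are chosen only for brevity). Training halts as soon as either the epoch limit is reached or the mean squared error falls below the target; when only the epoch limit triggers, that is the signal to revisit the structure or the dataset.

import java.util.Random;

// A minimal sketch of a two-part stop condition: a maximum number of epochs
// and a target mean squared error.
public class StopConditionDemo {

    static final double[][] INPUTS  = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
    static final double[]   TARGETS = {0, 0, 0, 1};   // logical AND

    public static void main(String[] args) {
        int maxEpochs = 10000;      // stop condition 1: epoch limit
        double targetMse = 0.01;    // stop condition 2: error goal
        double learningRate = 0.5;

        Random rnd = new Random(1);
        double[] w = {rnd.nextDouble() - 0.5, rnd.nextDouble() - 0.5};
        double bias = 0;

        int epoch = 0;
        double mse = Double.MAX_VALUE;
        while (epoch < maxEpochs && mse > targetMse) {
            mse = 0;
            for (int p = 0; p < INPUTS.length; p++) {
                double out = sigmoid(w[0] * INPUTS[p][0] + w[1] * INPUTS[p][1] + bias);
                double error = TARGETS[p] - out;
                mse += error * error;
                // delta rule update for a single sigmoid neuron
                double delta = error * out * (1 - out);
                w[0] += learningRate * delta * INPUTS[p][0];
                w[1] += learningRate * delta * INPUTS[p][1];
                bias += learningRate * delta;
            }
            mse /= INPUTS.length;
            epoch++;
        }

        if (mse <= targetMse) {
            System.out.printf("Converged after %d epochs (mse=%.4f)%n", epoch, mse);
        } else {
            System.out.printf("Epoch limit reached without convergence (mse=%.4f);"
                    + " the structure or dataset may need to be redesigned%n", mse);
        }
    }

    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }
}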
