Benefits and limitations

The advantages and disadvantages of neural networks depend on which other machine learning methods they are compared to. However, neural-network-based classifiers, particularly the multilayer perceptron trained with error backpropagation, offer some clear advantages:

  • The mathematical foundation of a neural network does not require expertise in dynamic programming or advanced linear algebra; the basics of the gradient descent algorithm suffice.
  • A neural network can perform tasks that a linear algorithm cannot, such as separating classes that are not linearly separable.
  • MLPs are usually reliable for highly dynamic and nonlinear processes. Unlike support vector machines, they do not require the dimension of the problem to be increased through kernelization.
  • An MLP makes no assumptions about linearity, variable independence, or normality.
  • Training an MLP lends itself well to concurrent processing, particularly for online training. In most architectures, the algorithm can continue even if a node in the network fails.
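To illustrate the second point, XOR is the classic task no linear classifier can learn, while a two-layer perceptron solves it exactly. The following sketch is a minimal illustration, not a trained model: the hidden weights are hand-set (OR and AND detectors) rather than learned by backpropagation.

```python
import numpy as np

# XOR truth table: no single hyperplane separates {(0,0),(1,1)}
# from {(0,1),(1,0)}.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

def step(z):
    return (z > 0).astype(int)

# A linear classifier w.x + b: a coarse grid search over weights and
# bias finds no setting that fits all four points.
linear_ok = False
for w1 in np.linspace(-2, 2, 9):
    for w2 in np.linspace(-2, 2, 9):
        for b in np.linspace(-2, 2, 9):
            if np.array_equal(step(X @ np.array([w1, w2]) + b), y):
                linear_ok = True

# A two-layer perceptron with hand-set weights computes XOR exactly:
# hidden unit 1 fires on OR, hidden unit 2 fires on AND, and the
# output fires on "OR and not AND".
W1 = np.array([[1, 1], [1, 1]])
b1 = np.array([-0.5, -1.5])          # thresholds for OR, AND
h = step(X @ W1 + b1)
out = step(h @ np.array([1, -2]) - 0.5)
print(linear_ok, out.tolist())       # → False [0, 1, 1, 0]
```

The hidden layer remaps the inputs into a space where the classes become linearly separable, which is exactly what a single linear model cannot do.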

However, as with any machine learning algorithm, neural networks have their detractors. The most commonly documented limitations are as follows:

  • MLP models are black boxes: the association between features and classes may not be easy to describe.
  • MLP requires a lengthy training process, especially with the batch strategy. For example, a two-layer network has a time complexity (number of multiplications) of O(n·m·p·N·e) for n input variables, m hidden neurons, p output values, N observations, and e epochs. It is not uncommon for a solution to emerge only after thousands of epochs. The online training strategy with a momentum factor tends to converge faster and requires fewer epochs than the batch process.
  • Tuning the configuration parameters, such as the learning rate, the choice of activation function, the application of the softmax transformation, or the momentum factor, can turn into a lengthy process.
  • Estimating the minimum size of the training set needed for accurate results is not obvious.
  • A neural network cannot be incrementally retrained: any new labeled data requires an entirely new training cycle.
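To put the training cost in perspective, here is a back-of-the-envelope calculation that takes the stated O(n·m·p·N·e) bound at face value; the sizes chosen are illustrative assumptions, not values from the text:

```python
# Illustrative sizes (assumptions, not prescribed): n input variables,
# m hidden neurons, p output values, N observations, e epochs.
n, m, p, N, e = 20, 10, 3, 10_000, 1_000

# Multiplication count per the stated O(n*m*p*N*e) bound.
total_mults = n * m * p * N * e
print(f"{total_mults:,}")  # → 6,000,000,000
```

Even for this modest network, the bound runs into billions of multiplications, which is why batch training over thousands of epochs becomes lengthy.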