Classification tasks can be handled by any of the supervised neural networks covered so far in this book, but for problems of this kind more complex architectures, such as multilayer perceptrons (MLPs), are recommended. In this chapter, we are going to use the NeuralNet
class to build an MLP with one hidden layer and the sigmoid function at the output layer. Each output neuron represents one class.
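To make this architecture concrete, the following is a minimal, self-contained sketch of such a classifier: one hidden layer, sigmoid activations at the output, and one output neuron per class, with the predicted class taken as the most active output neuron. The class and method names here (MlpSketch, forward, predict) are illustrative only and are not the NeuralNet API used in this chapter.

```java
// Minimal sketch of a one-hidden-layer MLP classifier with sigmoid
// outputs. Names are illustrative, not the book's NeuralNet API.
public class MlpSketch {

    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Forward pass: input -> hidden (sigmoid) -> output (sigmoid).
    static double[] forward(double[] input, double[][] wHidden, double[][] wOutput) {
        double[] hidden = new double[wHidden.length];
        for (int j = 0; j < wHidden.length; j++) {
            double sum = 0.0;
            for (int i = 0; i < input.length; i++) {
                sum += wHidden[j][i] * input[i];
            }
            hidden[j] = sigmoid(sum);
        }
        double[] output = new double[wOutput.length];
        for (int k = 0; k < wOutput.length; k++) {
            double sum = 0.0;
            for (int j = 0; j < hidden.length; j++) {
                sum += wOutput[k][j] * hidden[j];
            }
            output[k] = sigmoid(sum);
        }
        return output;
    }

    // Each output neuron represents one class; the prediction is the
    // index of the most active output neuron.
    static int predict(double[] output) {
        int best = 0;
        for (int k = 1; k < output.length; k++) {
            if (output[k] > output[best]) {
                best = k;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Toy hand-picked weights for a 2-input, 2-hidden, 2-class net.
        double[][] wHidden = {{1.0, -1.0}, {-1.0, 1.0}};
        double[][] wOutput = {{2.0, -2.0}, {-2.0, 2.0}};
        double[] out = forward(new double[]{1.0, 0.0}, wHidden, wOutput);
        System.out.println("predicted class: " + predict(out));  // prints "predicted class: 0"
    }
}
```

In a real run, the weights would of course come from backpropagation training rather than being hand-picked; the sketch only shows how the one-neuron-per-class output layer turns sigmoid activations into a class decision.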
The code that implements the examples is very similar to that of the test class (BackpropagationTest
). However, the DiagnosisExample
class additionally asks the user which dataset to use, as well as other neural network parameters, such as the number of epochs, the number of neurons in the hidden layer, and the learning rate.
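A small sketch of how such interactive prompts might look is shown below. The ParamPromptSketch class and its Params holder are hypothetical, assumed names, not the actual DiagnosisExample code; reading from a Scanner passed in as a parameter keeps the logic easy to exercise without typed input.

```java
import java.util.Scanner;

// Hypothetical sketch of prompting for the settings described above;
// the real DiagnosisExample class may gather them differently.
public class ParamPromptSketch {

    // Holds the user-supplied settings; field names are illustrative.
    static class Params {
        String dataset;
        int epochs;
        int hiddenNeurons;
        double learningRate;
    }

    // Reads the settings from any Scanner, so the same code works with
    // System.in or, for testing, with a Scanner over a String.
    static Params readParams(Scanner in) {
        Params p = new Params();
        System.out.print("Which dataset would you like to use? ");
        p.dataset = in.next();
        System.out.print("Number of epochs: ");
        p.epochs = Integer.parseInt(in.next());
        System.out.print("Number of neurons in the hidden layer: ");
        p.hiddenNeurons = Integer.parseInt(in.next());
        System.out.print("Learning rate: ");
        // parseDouble avoids the locale-dependent behavior of nextDouble().
        p.learningRate = Double.parseDouble(in.next());
        return p;
    }

    public static void main(String[] args) {
        // Simulated user input; replace with new Scanner(System.in)
        // for a genuinely interactive run.
        Params p = readParams(new Scanner("diagnosis 1000 5 0.1"));
        System.out.println();
        System.out.println(p.dataset + " " + p.epochs + " "
                + p.hiddenNeurons + " " + p.learningRate);
    }
}
```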