Compiling the model

To configure the learning process for the neural network, we compile the model by specifying the loss, optimizer, and metrics, as shown in the following code:

# Compile the model
model %>%
  compile(loss = 'categorical_crossentropy',
          optimizer = 'adam',
          metrics = 'accuracy')

We use loss to specify the objective function that we want to optimize. As shown in the preceding code, we use 'categorical_crossentropy' for the loss, since our target variable has three categories; for a target variable with two categories, we would use 'binary_crossentropy' instead. For the optimizer, we use the 'adam' optimization algorithm, a popular choice for deep learning. Its popularity is largely due to the fact that it tends to reach good results faster than other stochastic optimization methods, such as the adaptive gradient algorithm (AdaGrad) and root mean square propagation (RMSProp). Finally, metrics specifies how model performance is evaluated during training and testing; here we use accuracy to assess the classification performance of the model.
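To build intuition for why 'categorical_crossentropy' suits a three-category target, we can compute the loss by hand for a single observation. The following sketch uses hypothetical values: a one-hot encoded true label and a vector of predicted class probabilities, as would come from a softmax output layer:

```r
# Hypothetical one-hot encoded target: the true class is the second of three
y_true <- c(0, 1, 0)

# Hypothetical predicted probabilities from a softmax output layer
y_pred <- c(0.2, 0.7, 0.1)

# Categorical cross-entropy: negative sum of y_true * log(y_pred) over classes
loss <- -sum(y_true * log(y_pred))
loss
```

Because the target is one-hot encoded, the loss reduces to the negative log of the probability assigned to the true class (here, -log(0.7), roughly 0.357); the more confidently the model predicts the correct class, the smaller the loss.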

Now we are ready to fit the model, which we will do in the next section. 
