Loss functions

The loss function compares the output of a neural network with the target values in the training data, producing a loss value/score that measures how well the network's predictions match the expected values. In the last section, we saw that different tasks, such as regression and binary classification, call for different types of loss functions. Here are a few other popular loss functions; a minimal implementation sketch follows the list:

  • Binary cross-entropy: The log loss or cross-entropy loss for the two-class classification problem, discussed in the last section on logistic units.
  • Categorical cross-entropy: If we have a K-class classification problem, we use the cross-entropy loss generalized to K classes.
  • Mean Squared Error: The mean of the squared errors, which we have discussed several times. It is widely used for various regression tasks.
  • Mean Absolute Error: Measures the average magnitude of the errors in a set of predictions, without considering their direction. It is the average, over the test sample, of the absolute differences between predictions and actual observations. Unlike MSE, Mean Absolute Error (MAE) does not square the errors, so all individual differences are weighted equally and large errors are penalized less heavily, making it more robust to outliers.
  • Mean Absolute Percentage Error: Measures the size of the error in percentage terms. It is calculated as the average of the unsigned percentage errors. Mean Absolute Percentage Error (MAPE) is popular because percentages are easy to interpret.
  • Hinge loss/squared-hinge loss: Hinge loss is used in SVMs. It penalizes not only misclassified points but also correctly classified points that fall inside the margin. It is a good alternative to cross-entropy loss and can lead to faster training for neural networks as well. Higher-order variants, such as the squared-hinge loss, work even better for some classification tasks.
  • Kullback-Leibler (KL) divergence: KL divergence is a measure of how one probability distribution diverges from a second expected probability distribution. 
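
To make the definitions above concrete, here is a minimal NumPy sketch of each loss, computed over a batch of predictions. The function names and the `eps` clipping constant are illustrative choices, not the API of any particular framework; libraries such as Keras and PyTorch provide optimized, differentiable versions of these losses.

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    # Mean of squared differences; large errors dominate the average.
    return np.mean((y_true - y_pred) ** 2)

def mean_absolute_error(y_true, y_pred):
    # Mean of absolute differences; weights all errors equally.
    return np.mean(np.abs(y_true - y_pred))

def mean_absolute_percentage_error(y_true, y_pred):
    # Average unsigned percentage error; undefined if y_true contains zeros.
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Log loss for two classes; y_pred holds probabilities in (0, 1).
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # Generalization to K classes; y_true is one-hot, each y_pred row sums to 1.
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=-1))

def hinge_loss(y_true, y_pred):
    # y_true in {-1, +1}, y_pred is a raw score; penalizes margin violations.
    return np.mean(np.maximum(0.0, 1.0 - y_true * y_pred))

def squared_hinge_loss(y_true, y_pred):
    # Squared variant; penalizes margin violations quadratically.
    return np.mean(np.maximum(0.0, 1.0 - y_true * y_pred) ** 2)

def kl_divergence(p, q, eps=1e-12):
    # D_KL(p || q): how distribution p diverges from reference distribution q.
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q))
```

For example, `binary_cross_entropy(np.array([1, 0]), np.array([0.9, 0.2]))` returns roughly 0.164, since both predictions are close to their targets.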