Objective functions

The objective function (also called the loss function or cost function) measures how well your network is performing, and it is what training tries to minimize. It works by selecting a training instance, running it through the neural network, and then computing the loss of the resulting output.

The derivative of the loss function is then used to update the parameters of the model. For example, if the model confidently predicts an answer that turns out to be wrong, the computed loss is high; if the predicted answer is correct, the loss is low.
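To make the high-loss versus low-loss behaviour concrete, here is a minimal sketch using cross-entropy loss on a hypothetical 3-class prediction. The `cross_entropy` helper and the probability vectors are illustrative assumptions, not code from this text.

```python
import numpy as np

def cross_entropy(predicted_probs, true_label):
    """Cross-entropy loss for a single instance (hypothetical helper)."""
    # A small epsilon avoids log(0) when a probability is exactly zero.
    eps = 1e-12
    return -np.log(predicted_probs[true_label] + eps)

# A made-up 3-class example (classes 0, 1, 2); the true class is 2.
true_label = 2

# Confident but wrong: almost all probability mass on class 0.
confident_wrong = np.array([0.95, 0.04, 0.01])
print(cross_entropy(confident_wrong, true_label))   # ~4.6, a high loss

# Confident and correct: most probability mass on class 2.
confident_right = np.array([0.02, 0.03, 0.95])
print(cross_entropy(confident_right, true_label))   # ~0.05, a low loss
```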

How is the loss minimized?

  1. First, a training instance is selected
  2. Then, it is passed through the neural network to get an output
  3. Finally, the loss of that output is computed
  4. The derivative of the loss is used to update the model's parameters, reducing the loss on the next pass

During training, we minimize the loss function in order to reduce the probability of wrong predictions on the actual dataset.
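As a sketch of that loop, here is a hypothetical one-parameter model trained with mean-squared-error loss and plain gradient descent; the data, learning rate, and model are illustrative assumptions rather than code from this book, but the steps follow the list above.

```python
import numpy as np

# Tiny made-up dataset: the "true" relationship is y = 2x.
x_data = np.array([1.0, 2.0, 3.0, 4.0])
y_data = np.array([2.0, 4.0, 6.0, 8.0])

w = 0.0                 # single model parameter, starts far from the right value
learning_rate = 0.02    # illustrative step size

for step in range(100):
    # 1. Select a training instance (cycling through the dataset).
    i = step % len(x_data)
    x, y_true = x_data[i], y_data[i]

    # 2. Pass it through the model to get an output.
    y_pred = w * x

    # 3. Compute the loss of the output (squared error for one instance).
    loss = (y_pred - y_true) ** 2

    # 4. Use the derivative of the loss to update the parameter.
    grad = 2 * (y_pred - y_true) * x
    w -= learning_rate * grad

print(w)   # close to 2.0: the loss has been driven down during training
```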
