Determining the right activation function

The purpose of an activation function is to introduce non-linearity into a neural network. Without it, a stack of layers collapses into a single linear transformation, no matter how deep the network is; non-linearity is what allows a neural network to learn complex patterns. We will discuss some important activation functions and their respective DL4J implementations.

The following are the activation functions that we will consider:

  • Tanh
  • Sigmoid
  • ReLU (short for Rectified Linear Unit)
  • Leaky ReLU
  • Softmax

In this recipe, we will walk through the key steps for choosing the right activation function for each part of a neural network.
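
As a point of reference for the recipes that follow, here is a minimal sketch of how activation functions are attached to layers in DL4J, using the `Activation` enum from ND4J. The layer sizes (784, 128, 10) are placeholder values for illustration, not values prescribed by this recipe:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class ActivationExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .list()
            // Hidden layer: ReLU is a common default for hidden layers
            .layer(new DenseLayer.Builder()
                .nIn(784).nOut(128)              // placeholder layer sizes
                .activation(Activation.RELU)
                .build())
            // Output layer: softmax for multi-class classification
            .layer(new OutputLayer.Builder(
                    LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                .nIn(128).nOut(10)
                .activation(Activation.SOFTMAX)
                .build())
            .build();
    }
}
```

The other functions in the preceding list are available in the same way, for example `Activation.TANH`, `Activation.SIGMOID`, and `Activation.LEAKYRELU`, so switching activation functions is a one-line change in the layer builder.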
