The Rectified Linear Unit function

The Rectified Linear Unit (ReLU) function is another of the most commonly used activation functions. It outputs a value from 0 to infinity. It is basically a piecewise function and can be expressed as follows:

f(x) = 0, if x < 0
f(x) = x, if x >= 0

That is, the function returns zero when the value of x is less than zero and returns x when the value of x is greater than or equal to zero. It can also be expressed more compactly as follows:

f(x) = max(0, x)

The ReLU function is shown in the following figure:

As we can see in the preceding diagram, when we feed any negative input to the ReLU function, it converts it to zero. The drawback of outputting zero for all negative values is a problem known as the dying ReLU problem, where a neuron is said to be dead if it always outputs zero. The ReLU function can be implemented as follows:

def ReLU(x):
    # return 0 for negative inputs and the input itself otherwise
    if x < 0:
        return 0
    else:
        return x
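
As a quick check, the following is a minimal sketch of the same idea applied element-wise to an array, assuming NumPy is available; the function name relu_vectorized and the sample values are only illustrative:

import numpy as np

def relu_vectorized(x):
    # np.maximum compares each element with 0 and keeps the larger value,
    # so negative inputs become 0 and non-negative inputs pass through unchanged
    return np.maximum(0, x)

x = np.array([-3.0, -1.0, 0.0, 2.0, 5.0])
print(relu_vectorized(x))   # [0. 0. 0. 2. 5.]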