The leaky ReLU function

Leaky ReLU is a variant of the ReLU function that solves the dying ReLU problem. Instead of converting every negative input to zero, it applies a small, non-zero slope to negative values.

Leaky ReLU can be expressed as follows:

f(x) = x,     if x >= 0
f(x) = αx,    if x < 0

The value of α is typically set to 0.01. The leaky ReLU function is implemented as follows:

def leakyReLU(x, alpha=0.01):
    # positive inputs pass through unchanged;
    # negative inputs are scaled by the small slope alpha
    if x < 0:
        return alpha * x
    else:
        return x
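
As a quick check, the function can be called on a few sample inputs; the outputs below follow directly from the definition above:

print(leakyReLU(5))     # positive input is returned unchanged: 5
print(leakyReLU(-5))    # negative input is scaled by alpha: -0.05
print(leakyReLU(0))     # zero is returned as is: 0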

Instead of setting a default value for α, we can send it as a parameter to the neural network and make the network learn the optimal value of α. Such an activation function is termed the Parametric ReLU function. We can also set the value of α to some random value, and this is called the Randomized ReLU function. A small sketch of both ideas follows.
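
As an illustration only, here is a minimal NumPy sketch of both variants; the function names and the sampling range used for the randomized version are assumptions for this sketch, not part of the original text:

import numpy as np

def parametric_relu(x, alpha):
    # Parametric ReLU: alpha is treated as a learnable parameter and is
    # updated during training along with the network weights
    # (its gradient is x for negative inputs and 0 otherwise).
    return np.where(x < 0, alpha * x, x)

def randomized_relu(x, low=0.01, high=0.1):
    # Randomized ReLU: alpha is drawn at random for each forward pass
    # during training; the range [low, high] here is an assumed example.
    alpha = np.random.uniform(low, high)
    return np.where(x < 0, alpha * x, x)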
