Introducing non-linearity

In the convolution step, we talked about feeding its output to a ReLU activation function to introduce non-linearity:

Figure 9.6: ReLU activation function
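As a quick illustration of the function plotted in Figure 9.6, ReLU can be written as f(x) = max(0, x). The following minimal sketch (assuming NumPy is available; the sample values are made up for illustration) shows how it behaves on a handful of inputs:

import numpy as np

def relu(x):
    # ReLU: keep positive values unchanged, replace negative values with zero
    return np.maximum(0, x)

# Negative inputs are mapped to 0, positive inputs pass through unchanged
print(relu(np.array([-3.0, -0.5, 0.0, 2.0, 5.0])))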

The ReLU activation function replaces all the negative pixel values with zeros. The whole purpose of feeding the output of the convolution step to this activation function is to introduce non-linearity into the output image, which is useful for the training process because the data we are dealing with is usually non-linear. To clearly understand the benefit of the ReLU activation function, have a look at the following figure, which shows the raw output of the convolution step and its rectified version:

Figure 9.7: The result of applying ReLU to the input feature map
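To mirror what Figure 9.7 shows, the following sketch applies ReLU element-wise to a small feature map standing in for the raw output of the convolution step (the values here are made up for illustration): every negative entry is rectified to zero while positive entries are left untouched.

import numpy as np

# A small, made-up feature map standing in for the raw output of the convolution step
raw_feature_map = np.array([
    [ 12, -20,   7],
    [ -5,  30, -14],
    [  9,  -3,  18],
])

# Element-wise ReLU: negative activations are set to zero, positives pass through
rectified_feature_map = np.maximum(0, raw_feature_map)

print(rectified_feature_map)
# [[12  0  7]
#  [ 0 30  0]
#  [ 9  0 18]]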