Plotting values on a graph

Let's work on a simple program that plots values on a graph. To do this, use the following code:

# This line is necessary to display the output inside Jupyter Notebook
%matplotlib inline

import torch
import matplotlib.pyplot as plt
import torch.nn.functional as F
from torch.autograd import Variable


# dummy data for the example
# create 200 evenly spaced points between -5 and 5
x = torch.linspace(-5, 5, 200)  # x data (tensor), shape=(200,)
x = Variable(x)
# convert to a NumPy array so matplotlib can plot the values
x_np = x.data.numpy()
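Note that Variable is a legacy API: since PyTorch 0.4, Variable has been merged into Tensor, so the wrapper above is effectively a no-op. A minimal sketch of the same setup without the wrapper, assuming PyTorch 0.4 or later:

import torch

# a plain tensor works directly; no Variable wrapper is needed
x = torch.linspace(-5, 5, 200)  # shape=(200,)
x_np = x.numpy()  # NumPy view of the tensor, used for plotting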

The following code block computes a few of the activation functions:


# ReLU function
y_relu = torch.relu(x).data.numpy()
# sigmoid function
y_sigmoid = torch.sigmoid(x).data.numpy()
# tanh function
y_tanh = torch.tanh(x).data.numpy()
# softplus function; softplus lives in torch.nn.functional, not the top-level torch namespace
y_softplus = F.softplus(x).data.numpy()
# softmax is also an activation function, but it outputs a probability
# distribution, so it is not plotted here:
# y_softmax = torch.softmax(x, dim=0).data.numpy()
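To see what the last comment means, the following minimal sketch (not part of the original example) applies torch.softmax to a small tensor and checks that the outputs sum to 1, which is why softmax is typically used to produce class probabilities:

import torch

t = torch.tensor([1.0, 2.0, 3.0])
probs = torch.softmax(t, dim=0)
print(probs)        # approximately tensor([0.0900, 0.2447, 0.6652])
print(probs.sum())  # tensor(1.), i.e. the outputs form a probability distribution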

Now use matplotlib to plot the activation functions:


# plot each activation function in a 2x2 grid of subplots
plt.figure(1, figsize=(8, 6))

# ReLU activation function
plt.subplot(221)
plt.plot(x_np, y_relu, c='red', label='relu')
plt.ylim((-1, 5))
plt.legend(loc='best')

#sigmoid activation function
plt.subplot(222)
plt.plot(x_np, y_sigmoid, c='red', label='sigmoid')
plt.ylim((-0.2, 1.2))
plt.legend(loc='best')

#tanh activation function
plt.subplot(223)
plt.plot(x_np, y_tanh, c='red', label='tanh')
plt.ylim((-1.2, 1.2))
plt.legend(loc='best')

#softplus activation function
plt.subplot(224)
plt.plot(x_np, y_softplus, c='red', label='softplus')
plt.ylim((-0.2, 6))
plt.legend(loc='best')

#call the show method to draw the graph on screen
plt.show()

Running this code plots the values on the graph, as shown in the following screenshot:

Note that the first line in the preceding code, %matplotlib inline, is required to draw the graph inside Jupyter Notebook. If you are running the Python file directly from the Terminal, you can omit that line.
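If you are working outside Jupyter Notebook and want to keep the output, one option is to save the figure to a file before showing it; a minimal sketch, where the file name activations.png is just an example:

# save the current figure to disk before show() closes it
plt.savefig('activations.png', dpi=150)
plt.show()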
