Building network components in PyTorch

Before shifting our focus to NLP, in this section we will use non-linearities and affine maps to build a network in PyTorch. In this example, we will learn to compute a loss function using PyTorch's built-in negative log likelihood loss and to update the parameters with backpropagation.
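These three ingredients can be sketched in a few lines. The dimensions and the target class below are arbitrary choices for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Affine map f(x) = Ax + b, provided by nn.Linear
lin = nn.Linear(5, 3)            # maps R^5 -> R^3
x = torch.randn(1, 5)
affine_out = lin(x)

# Non-linearity: an element-wise function such as tanh
hidden = torch.tanh(affine_out)

# NLLLoss expects log-probabilities, so apply log_softmax first
log_probs = F.log_softmax(hidden, dim=1)
target = torch.tensor([1])       # index of the correct class
loss = nn.NLLLoss()(log_probs, target)

# Backpropagation: compute gradients of the loss w.r.t. the parameters
loss.backward()
print(loss.item())               # a positive scalar
print(lin.weight.grad.shape)     # gradients match the weight shape
```

In a training loop, an optimizer such as torch.optim.SGD would then use these gradients to update the parameters.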

Please note that every component of the network must inherit from nn.Module and override the forward() method. As boilerplate goes, these are the details to remember: inheriting from nn.Module is what provides a component with its core functionality, such as parameter tracking.
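A minimal sketch of this pattern follows; the class name and dimensions are illustrative, not from the text:

```python
import torch
import torch.nn as nn

class TwoLayerNet(nn.Module):
    """Network components inherit from nn.Module and override forward()."""
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        # Submodules assigned as attributes are registered automatically,
        # so their parameters appear in self.parameters()
        self.fc1 = nn.Linear(in_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, out_dim)

    def forward(self, x):
        # forward() defines the computation; autograd records it
        # so that backpropagation works without extra code
        return self.fc2(torch.tanh(self.fc1(x)))

net = TwoLayerNet(4, 8, 2)
out = net(torch.randn(3, 4))
print(out.shape)                     # torch.Size([3, 2])
print(len(list(net.parameters())))   # 4: two weight matrices, two biases
```

Calling net(x) invokes forward() through nn.Module's __call__, which is why forward() itself is never called directly.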

Now, as mentioned previously, we will look at an example in which the network takes a sparse bag-of-words (BoW) representation as input and outputs a probability distribution over two labels, English and Spanish. This model is an example of logistic regression.
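A compact sketch of such a classifier is given below. The toy sentences, the helper make_bow_vector, and the training settings are illustrative assumptions, not part of the text:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy corpus: (tokenized sentence, language label)
data = [("me gusta comer en la cafeteria".split(), "SPANISH"),
        ("Give it to me".split(), "ENGLISH")]

# Map each word in the vocabulary to a unique index
word_to_ix = {}
for sent, _ in data:
    for word in sent:
        if word not in word_to_ix:
            word_to_ix[word] = len(word_to_ix)
label_to_ix = {"SPANISH": 0, "ENGLISH": 1}

def make_bow_vector(sentence, word_to_ix):
    # Sparse BoW input: count of each vocabulary word in the sentence
    vec = torch.zeros(len(word_to_ix))
    for word in sentence:
        vec[word_to_ix[word]] += 1
    return vec.view(1, -1)

class BoWClassifier(nn.Module):
    def __init__(self, num_labels, vocab_size):
        super().__init__()
        # A single affine map: this is logistic regression
        self.linear = nn.Linear(vocab_size, num_labels)

    def forward(self, bow_vec):
        # log-probabilities over the two labels
        return F.log_softmax(self.linear(bow_vec), dim=1)

model = BoWClassifier(len(label_to_ix), len(word_to_ix))
loss_fn = nn.NLLLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for _ in range(50):
    for sent, label in data:
        model.zero_grad()
        log_probs = model(make_bow_vector(sent, word_to_ix))
        target = torch.tensor([label_to_ix[label]])
        loss = loss_fn(log_probs, target)
        loss.backward()      # backpropagate the NLL loss
        optimizer.step()     # update the parameters
```

After training, model(make_bow_vector(...)).argmax(dim=1) recovers the correct label for each toy sentence.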
