Chapter 4 - Generating Song Lyrics Using an RNN 

  1. A normal feedforward neural network predicts the output based only on the current input, whereas an RNN predicts the output based on the current input and the previous hidden state, which acts as a memory and stores the contextual information (the inputs) that the network has seen so far.
  2. The hidden state, h_t, at a time step, t, can be computed as follows:

     h_t = tanh(Ux_t + Wh_{t-1})

    In other words, the hidden state at a time step, t = tanh([input-to-hidden layer weight x input] + [hidden-to-hidden layer weight x previous hidden state]), where U is the input-to-hidden weight matrix and W is the hidden-to-hidden weight matrix. A minimal code sketch of this computation is given after this list.

  3. RNNs are widely applied for use cases that involve sequential data, such as time series, text, audio, speech, video, weather, and much more. They are widely used in various natural language processing (NLP) tasks, such as language translation, sentiment analysis, text generation, and so on.
  4. While backpropagating through an RNN, we multiply by the weights and by the derivative of the tanh function at every time step. As we multiply these small numbers together while moving backward, the gradient becomes infinitesimally small and effectively vanishes, so the network cannot learn long-term dependencies; this is called the vanishing gradient problem.
  5. When we initialize the weights of the network to very large values, the gradients become very large at every time step. While backpropagating, we multiply these large numbers together at every time step, and the gradient blows up toward infinity. This is called the exploding gradient problem (a small numeric demonstration of both problems follows this list).
  6. We use gradient clipping to bypass the exploding gradient problem. In this method, we normalize the gradients according to a vector norm (say, L2) and clip the gradient values to a certain range. For instance, if we set the threshold to 0.7, then we keep the gradients in the -0.7 to +0.7 range: if a gradient value falls below -0.7, we change it to -0.7, and if it exceeds +0.7, we change it to +0.7 (see the clipping sketch after this list).
  7. Different types of RNN architectures include one-to-one, one-to-many, many-to-one, and many-to-many, and they are used for various applications.
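
To make answer 2 concrete, the following is a minimal NumPy sketch of a single recurrent step. The dimensions and the names U, W, x_t, and h_prev are illustrative assumptions, not taken from the chapter's code:

```python
import numpy as np

np.random.seed(0)

input_dim, hidden_dim = 3, 4                         # illustrative sizes
U = np.random.randn(hidden_dim, input_dim) * 0.1     # input-to-hidden weights
W = np.random.randn(hidden_dim, hidden_dim) * 0.1    # hidden-to-hidden weights

x_t = np.random.randn(input_dim)   # input at the current time step
h_prev = np.zeros(hidden_dim)      # previous hidden state (the network's memory)

# h_t = tanh(U x_t + W h_{t-1})
h_t = np.tanh(U @ x_t + W @ h_prev)
print(h_t)
```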
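
Answers 4 and 5 can be illustrated with a toy calculation: repeatedly multiplying per-step factors smaller than 1 drives the gradient toward zero, while factors larger than 1 drive it toward infinity. This only sketches the effect; it is not the actual backpropagation-through-time computation:

```python
steps = 50
grad_small = 1.0   # stands in for a gradient shrunk by small per-step factors
grad_large = 1.0   # stands in for a gradient grown by large per-step factors

for _ in range(steps):
    grad_small *= 0.5   # per-step factor < 1 (small weights x tanh derivative)
    grad_large *= 1.5   # per-step factor > 1 (large weights)

print(f"after {steps} steps: vanishing ~ {grad_small:.3e}, exploding ~ {grad_large:.3e}")
```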
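
Answer 6 describes gradient clipping. The following sketch shows both common variants on a made-up gradient vector, clipping by the L2 norm and clipping each value to the [-0.7, +0.7] range; the threshold of 0.7 follows the example in the answer, and the gradient values are assumptions for illustration:

```python
import numpy as np

threshold = 0.7
grad = np.array([2.0, -1.5, 0.3])   # dummy gradient values

# Clip by L2 norm: rescale the whole vector if its norm exceeds the threshold.
norm = np.linalg.norm(grad)
clipped_by_norm = grad * (threshold / norm) if norm > threshold else grad

# Clip by value: force every component into the [-0.7, +0.7] range.
clipped_by_value = np.clip(grad, -threshold, threshold)

print(clipped_by_norm)    # direction preserved, norm reduced to 0.7
print(clipped_by_value)   # [ 0.7 -0.7  0.3]
```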