Summary

In this chapter, we learned how the LSTM cell uses several gates to combat the vanishing gradient problem. Then, we saw how to use the LSTM cell in TensorFlow to predict Bitcoin's price.
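As a quick refresher, the following is a minimal sketch of that idea using the TensorFlow 2.x Keras API. The window size and the random toy data here are placeholders standing in for the chapter's actual Bitcoin dataset and architecture:

import numpy as np
import tensorflow as tf

# Toy stand-in data: sliding windows of normalized prices
# (shape: samples x time steps x features)
window_size = 7
prices = np.random.rand(100, window_size, 1).astype("float32")
targets = np.random.rand(100, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window_size, 1)),
    tf.keras.layers.Dense(1)  # predicted next price
])
model.compile(optimizer="adam", loss="mse")
model.fit(prices, targets, epochs=5, verbose=0)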

After looking at LSTM cells, we learned about the GRU cell, which is a simplified version of the LSTM. We also learned about bidirectional RNNs, which use two layers of hidden states: one layer moves forward through time from the start of the sequence, while the other moves backward through time from the end of the sequence.
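The following sketch combines both ideas in Keras: a bidirectional layer built from GRU cells, where one GRU reads the sequence forward, the other reads it backward, and their hidden states are concatenated at each time step. The layer sizes are arbitrary choices for illustration:

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(
        tf.keras.layers.GRU(16, return_sequences=True),
        input_shape=(None, 8)  # (time steps, features); sizes are arbitrary
    ),
    tf.keras.layers.Dense(1)
])
model.summary()  # output width is 32: forward and backward states concatenated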

At the end of the chapter, we learned about the seq2seq model, which maps an input sequence of varying length to an output sequence of varying length. We also saw how the attention mechanism is used in the seq2seq model to focus on the most relevant parts of the input at each decoding step.
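One way to recall the core of the attention mechanism is through its defining equations; this is the standard soft-attention formulation, and the exact scoring function varies between variants:

e_{ij} = \mathrm{score}(s_{i-1}, h_j), \qquad
\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k} \exp(e_{ik})}, \qquad
c_i = \sum_{j} \alpha_{ij} h_j

Here, h_j are the encoder hidden states, s_{i-1} is the previous decoder state, the softmax-normalized weights \alpha_{ij} indicate how much attention decoding step i pays to input position j, and c_i is the resulting context vector fed to the decoder.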

In the next chapter, we will learn about convolutional neural networks and how they are used for recognizing images.
