Recurrent and Convolutional Neural Networks

Until now, we have been studying feed-forward networks, where data moves in a single direction and there are no connections among the nodes within a layer. For problems in which successive inputs depend on one another, this strictly unidirectional structure is a severe limitation. However, we can start from a feed-forward network and build networks in which the result computed by one unit influences the subsequent computation of other units. Naturally, the algorithms that govern the dynamics of such networks must satisfy new convergence criteria.
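To make the idea of a cyclic data flow concrete, one standard (generic) formulation of a recurrent update is the following, where $x_t$ is the input at time step $t$, $h_t$ is the hidden state, $y_t$ is the output, and $W$, $U$, $V$, $b$, $c$ are learned parameters; this notation is a sketch and not tied to any specific model in this book:

$$h_t = f(W x_t + U h_{t-1} + b), \qquad y_t = g(V h_t + c)$$

Because $h_t$ depends on $h_{t-1}$, the result of one step feeds back into the next step's computation, which is exactly the cyclic dependence that a feed-forward network lacks.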

In this chapter, we will introduce Recurrent Neural Networks (RNNs), networks whose data flow contains cycles. We will also look at Convolutional Neural Networks (CNNs), specialized neural networks used mainly for image recognition. For both types of network, we will work through sample implementations in R; a brief preview sketch follows the list below. The following topics are covered:

  • RNN
  • The rnn package
  • Long Short-Term Memory (LSTM) model
  • CNN
  • Common CNN architecture--LeNet
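
As a preview of the kind of R code used later in the chapter, the following is a minimal sketch of training a simple recurrent network with the rnn package on the binary-addition task from that package's documentation; the data sizes and hyperparameter values shown are illustrative assumptions, not prescriptions:

    library(rnn)
    set.seed(1)

    # two random integer operands and their sum
    X1 <- sample(0:127, 5000, replace = TRUE)
    X2 <- sample(0:127, 5000, replace = TRUE)
    Y  <- X1 + X2

    # convert everything to 8-bit binary sequences
    X1 <- int2bin(X1, length = 8)
    X2 <- int2bin(X2, length = 8)
    Y  <- int2bin(Y,  length = 8)

    # 3D input array: samples x time steps x variables
    X <- array(c(X1, X2), dim = c(dim(X1), 2))

    # train a small recurrent network (illustrative settings)
    model <- trainr(Y = Y, X = X,
                    learningrate = 0.1,
                    hidden_dim   = 10,
                    numepochs    = 10)

    # predict on the training inputs and convert back to integers
    pred <- bin2int(round(predictr(model, X)))

The trainr() and predictr() workflow shown here is discussed in detail in the section on the rnn package.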

By the end of the chapter, we will understand how to train, test, and evaluate an RNN, and we will learn how to visualize an RNN model in the R environment. We will also be able to train an LSTM model. Finally, we will cover CNN concepts and LeNet, a common CNN architecture.
