Generating text using RNNs

In previous chapters, we used Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks for text classification. Beyond such predictive tasks, RNNs can also be used to build generative models: because they can learn long-term dependencies in an input text, they can generate completely new sequences. These generative models can be either character-based or word-based. In the next section, we will look at a simple word-based text generation model.
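
To make the idea concrete before the next section, here is a minimal sketch of word-based generation with a Keras LSTM. This assumes a TensorFlow/Keras setup like the one used for classification in earlier chapters; the toy corpus, window length, layer sizes, and sampling loop are illustrative placeholder choices, not the model built later in the chapter.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

# Placeholder corpus for illustration only; a real model would train
# on a much larger body of text.
corpus = "the quick brown fox jumps over the lazy dog " * 50
words = corpus.split()
vocab = sorted(set(words))
word_to_id = {w: i for i, w in enumerate(vocab)}
id_to_word = {i: w for w, i in word_to_id.items()}

# Build (context window, next word) training pairs with a sliding window.
seq_len = 5
ids = [word_to_id[w] for w in words]
X = np.array([ids[i:i + seq_len] for i in range(len(ids) - seq_len)])
y = np.array([ids[i + seq_len] for i in range(len(ids) - seq_len)])

# A small word-level language model: embed each word, read the context
# with an LSTM, and predict a distribution over the next word.
model = Sequential([
    Embedding(input_dim=len(vocab), output_dim=32),
    LSTM(64),
    Dense(len(vocab), activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, epochs=5, verbose=0)

# Generate new text by repeatedly sampling the predicted next word
# and feeding it back in as the new context.
seed = ids[:seq_len]
generated = [id_to_word[i] for i in seed]
for _ in range(20):
    probs = model.predict(np.array([seed[-seq_len:]]), verbose=0)[0]
    probs = probs.astype("float64")
    probs /= probs.sum()  # renormalize to guard against float32 rounding
    next_id = int(np.random.choice(len(vocab), p=probs))
    generated.append(id_to_word[next_id])
    seed.append(next_id)
print(" ".join(generated))

Sampling from the softmax distribution (rather than always taking the most likely word) keeps the output varied; a character-based model would follow the same loop with characters in place of words.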
