Summary

In this chapter, we focused on text generation and text summarization. Using a GRU-based RNN, we illustrated an example text generation model that can generate Linux kernel code. Such models, when applied to other domains or source texts, can help us understand the underlying structure and context of the input. Next, we described the different types of text summarization and explained a simple extractive approach, using gensim to generate product review summaries. While extractive summarization reproduces words and sentences from the source text, abstractive summarization can generate novel, more intuitive summaries.
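To recap the shape of the generation model, the following is a minimal character-level sketch in Keras. The vocabulary size, context window, and layer widths here are illustrative assumptions, not the exact configuration used in the chapter.

from tensorflow.keras.layers import Input, Embedding, GRU, Dense
from tensorflow.keras.models import Model

# Illustrative values: a small character vocabulary and context window.
n_chars, seq_len = 100, 64

# Given a window of seq_len characters, predict the next character.
inputs = Input(shape=(seq_len,))
x = Embedding(n_chars, 32)(inputs)
x = GRU(256)(x)
outputs = Dense(n_chars, activation="softmax")(x)

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

Sampling repeatedly from the softmax output, and feeding each sampled character back into the context window, generates text one character at a time.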
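The extractive approach, by contrast, needs very little code. Below is a sketch using gensim's TextRank-based summarization module; note that this module was removed in gensim 4.0, so it requires an earlier version, and the input file name is a hypothetical placeholder.

# Requires gensim < 4.0; the summarization module was removed in 4.x.
from gensim.summarization import summarize

# Hypothetical input: a plain-text file of product reviews.
review_text = open("product_reviews.txt").read()

# TextRank-based extractive summary keeping roughly 20% of the sentences.
print(summarize(review_text, ratio=0.2))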

To cover abstractive summarization, we introduced an encoder-decoder model, using GRU-based RNNs to summarize news text. We used CNN news articles as input data to produce short summaries. Finally, we looked at some state-of-the-art approaches that improve upon our base encoder-decoder model (with attention). You can build on the base model that we developed to incorporate these enhancements. One simple enhancement is to add additional features, such as POS tags, NER tags, and TF-IDF scores, to the word embeddings of the input text, as sketched below. In the next chapter, we will look at another interesting topic: question answering and chatbots. We will develop a question-answering model and build a chatbot using a generative RNN model.
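As a sketch of how that enhancement could be wired in, the snippet below builds a minimal GRU encoder-decoder in Keras and concatenates hypothetical per-token features (such as one-hot POS/NER tags and a TF-IDF score) onto the encoder's word embeddings. All sizes and input names are illustrative assumptions, and the attention mechanism is omitted for brevity.

from tensorflow.keras.layers import Input, Embedding, GRU, Dense, Concatenate
from tensorflow.keras.models import Model

# Illustrative sizes -- not the chapter's exact configuration.
vocab_size, embed_dim, hidden_dim, n_extra = 20000, 128, 256, 16

# Encoder over the news article, with extra per-token features
# (e.g., one-hot POS/NER tags plus a TF-IDF score) concatenated
# onto the word embeddings before the GRU.
enc_tokens = Input(shape=(None,), name="article_tokens")
enc_feats = Input(shape=(None, n_extra), name="article_features")
enc_embed = Embedding(vocab_size, embed_dim)(enc_tokens)
enc_aug = Concatenate()([enc_embed, enc_feats])
_, enc_state = GRU(hidden_dim, return_state=True)(enc_aug)

# Decoder generates the summary, initialized with the encoder's final state.
dec_tokens = Input(shape=(None,), name="summary_tokens")
dec_embed = Embedding(vocab_size, embed_dim)(dec_tokens)
dec_out, _ = GRU(hidden_dim, return_sequences=True, return_state=True)(
    dec_embed, initial_state=enc_state)
next_word = Dense(vocab_size, activation="softmax")(dec_out)

model = Model([enc_tokens, enc_feats, dec_tokens], next_word)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

At training time, the decoder input is the target summary shifted right (teacher forcing); at inference time, you would decode step by step from the encoder state, feeding each predicted word back in.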
