Text Generation With LSTM Recurrent Neural Networks in Python with Keras
Last Updated on September 3, 2020
Recurrent neural networks can also be used as generative models.
This means that in addition to being used as predictive models (making predictions), they can learn the sequences of a problem and then generate entirely new, plausible sequences for the problem domain.
Generative models like this are useful not only for studying how well a model has learned a problem, but also for learning more about the problem domain itself.
In this post, you will discover how to create a generative model for text, character by character, using LSTM recurrent neural networks in Python with Keras.
After reading this post, you will know:
- Where to download a free corpus of text that you can use to train text generative models.
- How to frame the problem of text sequences for a recurrent neural network generative model.
- How to develop an LSTM to generate plausible text sequences for a given problem.
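The framing in the second bullet can be sketched in plain Python: map each distinct character to an integer, then slide a fixed-length window over the text so that each training pattern is a sequence of characters paired with the single character that follows it. The toy string and window length below are illustrative assumptions, not values from the tutorial.

```python
# Minimal sketch of character-level sequence framing for a generative model.
# The text and seq_length are toy assumptions for illustration.
text = "hello world"
chars = sorted(set(text))                      # unique characters, sorted
char_to_int = {c: i for i, c in enumerate(chars)}  # character -> integer map

seq_length = 3
dataX, dataY = [], []
for i in range(len(text) - seq_length):
    seq_in = text[i:i + seq_length]            # input window of characters
    seq_out = text[i + seq_length]             # the character to predict
    dataX.append([char_to_int[c] for c in seq_in])
    dataY.append(char_to_int[seq_out])

print(len(dataX))  # number of training patterns: len(text) - seq_length
```

Each row of `dataX` would then be reshaped and normalized (or one-hot encoded) before being fed to an LSTM, with `dataY` one-hot encoded as the prediction target.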
Kick-start your project with my new book Deep Learning for Natural Language Processing, including step-by-step tutorials and the Python source code files for all examples.
Let’s get started.
Note: LSTM recurrent neural networks can be slow to train, and it is highly recommended that you run the examples on GPU hardware.