How to Use the TimeDistributed Layer in Keras
Last Updated on August 14, 2019
Long Short-Term Memory networks, or LSTMs, are a popular and powerful type of Recurrent Neural Network (RNN).
They can be quite difficult to configure and apply to arbitrary sequence prediction problems, even with well-defined and “easy to use” interfaces like those provided by the Keras deep learning library in Python.
One reason for this difficulty in Keras is the use of the TimeDistributed wrapper layer and the need for some LSTM layers to return sequences rather than single values.
In this tutorial, you will discover different ways to configure LSTM networks for sequence prediction, the role that the TimeDistributed layer plays, and exactly how to use it.
After completing this tutorial, you will know:
- How to design a one-to-one LSTM for sequence prediction.
- How to design a many-to-one LSTM for sequence prediction without the TimeDistributed Layer.
- How to design a many-to-many LSTM for sequence prediction with the TimeDistributed Layer.
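As a preview of the many-to-many case, here is a minimal sketch of an LSTM that returns sequences and wraps a Dense layer in TimeDistributed (this assumes TensorFlow 2.x with its bundled tf.keras; the layer sizes and the simple echo task are illustrative choices, not taken from the tutorial):

```python
# Many-to-many sketch: return_sequences=True + TimeDistributed(Dense)
# Assumption: TensorFlow 2.x (tf.keras); sizes are illustrative only.
import numpy as np
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import LSTM, Dense, TimeDistributed

length = 5
X = np.linspace(0.0, 0.8, length).reshape(1, length, 1)  # 1 sample, 5 steps, 1 feature
y = X  # echo task: predict each input value at its own time step

model = Sequential([
    Input(shape=(length, 1)),
    LSTM(5, return_sequences=True),  # emit an output at every time step
    TimeDistributed(Dense(1)),       # apply the same Dense weights to each step
])
model.compile(loss="mean_squared_error", optimizer="adam")
print(model.output_shape)  # (None, 5, 1): one prediction per time step
```

Because return_sequences=True, the LSTM hands a 3D tensor (samples, time steps, units) to the wrapped Dense layer, which is applied once per time step with shared weights.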
Kick-start your project with my new book Long Short-Term Memory Networks With Python, including step-by-step tutorials and the Python source code files for all examples.
Let’s get started.
- Update Jun/2019: It seems that the Dense layer can now directly support 3D input, perhaps negating the need for the TimeDistributed layer in this case.
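This update can be checked directly: in recent TensorFlow versions, a Dense layer applied to 3D input operates on the last axis, step by step, so its output shape matches the TimeDistributed-wrapped version (a small sketch, assuming TensorFlow 2.x; the shapes chosen are arbitrary):

```python
# Sketch comparing Dense on 3D input with TimeDistributed(Dense).
# Assumption: TensorFlow 2.x (tf.keras); shapes are arbitrary examples.
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import Dense, TimeDistributed

plain = Sequential([Input(shape=(5, 3)), Dense(4)])                   # Dense on 3D input
wrapped = Sequential([Input(shape=(5, 3)), TimeDistributed(Dense(4))])  # explicit wrapper

# Both produce one 4-unit output per time step
print(plain.output_shape, wrapped.output_shape)  # (None, 5, 4) (None, 5, 4)
```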