Stacked Long Short-Term Memory Networks
Last Updated on August 14, 2019
A gentle introduction to the Stacked LSTM, with example code in Python.
The original LSTM model comprises a single hidden LSTM layer followed by a standard feedforward output layer.
The Stacked LSTM is an extension of this model that has multiple hidden LSTM layers, where each layer contains multiple memory cells.
In this post, you will discover the Stacked LSTM model architecture.
After completing this tutorial, you will know:
- The benefit of deep neural network architectures.
- The Stacked LSTM recurrent neural network architecture.
- How to implement stacked LSTMs in Python with Keras.
Kick-start your project with my new book Long Short-Term Memory Networks With Python, including step-by-step tutorials and the Python source code files for all examples.
Let’s get started.
Overview
This post is divided into 3 parts; they are:
- Why Increase Depth?
- Stacked LSTM Architecture
- Implement Stacked LSTMs in Keras
Why Increase Depth?
Stacking LSTM hidden layers makes the model deeper, more accurately earning the description as a deep learning technique.
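As a preview of what stacking looks like in practice, here is a minimal sketch of a two-layer Stacked LSTM using the Keras `Sequential` API (the layer sizes, sequence length, and feature count below are illustrative choices, not values from the original post). The key detail is `return_sequences=True` on the lower layer, so it emits an output per time step for the next LSTM layer to consume as a sequence.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Each input sample is a sequence of 3 time steps with 1 feature
# (illustrative shape; adapt to your data).
model = Sequential([
    # return_sequences=True makes this layer output a value at every
    # time step, which the stacked LSTM layer above it requires.
    LSTM(32, return_sequences=True, input_shape=(3, 1)),
    # The top LSTM layer returns only its final output (the default).
    LSTM(32),
    Dense(1),
])
model.compile(loss="mse", optimizer="adam")
model.summary()
```

Omitting `return_sequences=True` on the first layer would collapse its output to a single vector, and the second LSTM layer would raise a shape error because it expects 3D sequence input.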