A Gentle Introduction to Backpropagation Through Time
Last Updated on August 14, 2020
Backpropagation Through Time, or BPTT, is the training algorithm used to update weights in recurrent neural networks like LSTMs.
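To make that concrete, below is a minimal sketch of BPTT in action. It assumes PyTorch (this post itself contains no code), and the model sizes, learning rate, and random data are arbitrary illustrations, not a recommended setup. The key point is that calling backward() on a loss computed over a whole sequence propagates errors back through every timestep of the unrolled network.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Illustrative model and optimizer (assumed names and sizes, not from the post)
rnn = nn.LSTM(input_size=1, hidden_size=8, batch_first=True)
head = nn.Linear(8, 1)
opt = torch.optim.SGD(list(rnn.parameters()) + list(head.parameters()), lr=0.01)

x = torch.randn(4, 20, 1)   # 4 sequences, 20 timesteps, 1 feature
y = torch.randn(4, 20, 1)   # a target at every timestep

out, _ = rnn(x)                              # forward pass unrolls over all 20 steps
loss = nn.functional.mse_loss(head(out), y)  # loss averaged over all timesteps

opt.zero_grad()
loss.backward()   # BPTT: gradients flow back through every timestep
opt.step()
```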
To effectively frame sequence prediction problems for recurrent neural networks, you must have a strong conceptual understanding of what Backpropagation Through Time is doing and how configurable variations like Truncated Backpropagation Through Time will affect the skill, stability, and speed of training your network.
In this post, you will get a gentle introduction to Backpropagation Through Time intended for the practitioner (no equations!).
After reading this post, you will know:
- What Backpropagation Through Time is and how it relates to the Backpropagation training algorithm used by Multilayer Perceptron networks.
- The motivations that led to the need for Truncated Backpropagation Through Time, the most widely used variant for training LSTMs in deep learning.
- A notation for thinking about how to configure Truncated Backpropagation Through Time and the canonical configurations used in research and by deep learning libraries (a short code sketch follows this list).
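As referenced above, here is a hedged sketch of Truncated Backpropagation Through Time, again assuming PyTorch; the chunk length k, model, and data are illustrative choices, not a canonical configuration. In the TBPTT(k1, k2) notation common in the literature, where the network runs forward k1 timesteps and then backpropagates over k2 timesteps, this sketch implements the frequent special case k1 = k2 = k: process the sequence in chunks of k steps, backpropagate only within each chunk, and detach the hidden state so no gradient crosses chunk boundaries.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

k = 10   # truncation length: forward chunk and backward window (k1 = k2)
rnn = nn.LSTM(input_size=1, hidden_size=8, batch_first=True)
head = nn.Linear(8, 1)
opt = torch.optim.SGD(list(rnn.parameters()) + list(head.parameters()), lr=0.01)

x = torch.randn(4, 100, 1)   # long sequences: 100 timesteps each
y = torch.randn(4, 100, 1)
state = None                 # LSTM state (h, c), carried across chunks

for t in range(0, x.size(1), k):
    out, state = rnn(x[:, t:t + k], state)   # forward over one chunk of k steps
    loss = nn.functional.mse_loss(head(out), y[:, t:t + k])
    opt.zero_grad()
    loss.backward()          # gradients reach back at most k timesteps
    opt.step()
    # Truncation: keep the state's values but cut the gradient path,
    # so the next chunk's backward pass stops at this boundary.
    state = tuple(s.detach() for s in state)
```

Detaching the state is what makes the backpropagation "truncated": the network still sees information from earlier chunks through the carried-over state values, but updates are computed as if each chunk were the start of the sequence, which bounds memory use and speeds up training.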
Kick-start your project with my new book Long Short-Term Memory Networks With Python, including step-by-step tutorials and the Python source code files for all examples.