How to Prepare Sequence Prediction for Truncated BPTT in Keras
Last Updated on August 14, 2019
Recurrent neural networks are able to learn the temporal dependence across multiple timesteps in sequence prediction problems.
Modern recurrent neural networks like the Long Short-Term Memory, or LSTM, network are trained with a variation of the Backpropagation algorithm called Backpropagation Through Time. This algorithm has been modified further for efficiency on sequence prediction problems with very long sequences and is called Truncated Backpropagation Through Time.
An important configuration decision when training recurrent neural networks like LSTMs with Truncated Backpropagation Through Time is how many timesteps to use as input. That is, how exactly should you split up your very long input sequences into subsequences in order to get the best performance?
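To make this concrete, below is a minimal sketch of how this choice plays out in Keras: the number of timesteps in the input shape fixes how far back gradients flow during training. The sequence values, the 5-timestep choice, and the mean-of-subsequence target are all made up for illustration, not taken from the post.

```python
from numpy import array
from keras.models import Sequential
from keras.layers import LSTM, Dense

# A hypothetical univariate sequence of length 100.
sequence = array([i / 100.0 for i in range(100)])

# Choosing 5 timesteps per subsequence fixes the truncation
# length: errors are backpropagated at most 5 steps.
n_timesteps = 5
X = sequence.reshape(20, n_timesteps, 1)  # [samples, timesteps, features]
y = sequence.reshape(20, n_timesteps).mean(axis=1)  # toy target per subsequence

model = Sequential()
model.add(LSTM(10, input_shape=(n_timesteps, 1)))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(X, y, epochs=2, batch_size=4, verbose=0)
```

Changing `n_timesteps` (and reshaping `X` to match) is therefore the lever that controls the effective Truncated Backpropagation Through Time window in Keras.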
In this post, you will discover 6 different ways you can split up very long input sequences to effectively train recurrent neural networks using Truncated Backpropagation Through Time in Python with Keras.
After reading this post, you will know:
- What Truncated Backpropagation Through Time is and how it has been implemented in the Python deep learning library Keras.
- How exactly the choice of the number of input timesteps affects learning within recurrent neural networks.
- 6 different techniques you can use to split up very long input sequences for Truncated Backpropagation Through Time (one illustrative split is sketched after this list).
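As a taste of what such splitting can look like, here is a small sketch of one generic option, a sliding window over the long sequence. This is an illustrative assumption for demonstration, not necessarily one of the post's six techniques, and the helper name `sliding_windows` is hypothetical.

```python
from numpy import array

def sliding_windows(sequence, n_timesteps):
    # Slice a long sequence into overlapping subsequences of
    # n_timesteps values each (a hypothetical splitting scheme).
    return array([sequence[i:i + n_timesteps]
                  for i in range(len(sequence) - n_timesteps + 1)])

long_sequence = list(range(10))
print(sliding_windows(long_sequence, 3))
# [[0 1 2]
#  [1 2 3]
#  ...
#  [7 8 9]]
```

Each row can then serve as one input sample of `n_timesteps` steps, in the `[samples, timesteps, features]` layout Keras expects.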