The Promise of Recurrent Neural Networks for Time Series Forecasting
Last Updated on August 5, 2019
Recurrent neural networks are a type of neural network that adds explicit handling of the order of input observations.
This capability suggests that the promise of recurrent neural networks is to learn the temporal context of input sequences in order to make better predictions. That is, the suite of lagged observations required to make a prediction no longer needs to be diagnosed and specified, as it must be in traditional time series forecasting or even in forecasting with classical neural networks. Instead, the temporal dependence can be learned, and perhaps changes in that dependence can be learned as well.
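As a rough illustration of this framing, the sketch below (not from the original post; it assumes Keras/TensorFlow and uses a made-up sine-wave series) feeds a window of raw recent observations to an LSTM and leaves it to the network to learn which temporal dependencies matter, rather than hand-diagnosing specific lags with classical tools.

```python
# A minimal sketch, assuming Keras/TensorFlow is installed and using a
# hypothetical sine-wave series purely for illustration.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Toy univariate series (stand-in for real data).
series = np.sin(np.linspace(0, 20, 200))

# Frame the series as input windows of the last 10 raw observations.
# No lag diagnosis (e.g. ACF/PACF analysis) is performed; the window is
# simply the recent history, and the LSTM is left to learn which parts matter.
window = 10
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X.reshape((X.shape[0], window, 1))  # [samples, timesteps, features]

# A small LSTM model that maps a window of observations to a one-step forecast.
model = Sequential([
    LSTM(32, input_shape=(window, 1)),
    Dense(1)
])
model.compile(optimizer='adam', loss='mse')
model.fit(X, y, epochs=20, verbose=0)

# One-step forecast from the most recent window of observations.
yhat = model.predict(X[-1:], verbose=0)
print(yhat)
```

The design choice here is simply to hand the network recent raw history and let training decide the dependence structure, which is the promise the post describes.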
In this post, you will discover the promised capability of recurrent neural networks for time series forecasting. After reading this post, you will know:
- The focus, and the implicit if not explicit limitations, of traditional time series forecasting methods.
- The capabilities provided by traditional feed-forward neural networks for time series forecasting.
- The additional promise that recurrent neural networks offer over traditional neural networks, and hints of what this may mean in practice.
Kick-start your project with my new book Deep Learning for Time Series Forecasting, including step-by-step tutorials and the Python source code files for all examples.