Get the Most out of LSTMs on Your Sequence Prediction Problem

Last Updated on August 14, 2019 Long Short-Term Memory (LSTM) Recurrent Neural Networks are a powerful type of deep learning suited for sequence prediction problems. A possible concern when using LSTMs is whether the added complexity of the model is improving its skill or is in fact resulting in lower skill than simpler models. In this post, you will discover simple experiments you can run to ensure you are getting the most out of LSTMs on your […]
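
As a rough sketch of the kind of experiment the post describes, the snippet below pits a small LSTM against a plain MLP baseline in Keras; the data, shapes, and layer sizes are all illustrative stand-ins rather than the post's own configuration, and in practice you would compare skill on a held-out test set rather than the training data.

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, LSTM

# toy data: 100 samples of 10 timesteps with 1 feature (illustrative only)
X = np.random.rand(100, 10, 1)
y = np.random.rand(100, 1)

# simpler baseline: a plain MLP on the flattened sequence
mlp = Sequential([Dense(10, activation='relu', input_shape=(10,)), Dense(1)])
mlp.compile(loss='mse', optimizer='adam')
mlp.fit(X.reshape(100, 10), y, epochs=10, verbose=0)

# the LSTM under test
lstm = Sequential([LSTM(10, input_shape=(10, 1)), Dense(1)])
lstm.compile(loss='mse', optimizer='adam')
lstm.fit(X, y, epochs=10, verbose=0)

# compare skill; if the MLP wins, the LSTM's extra complexity is not paying off
print(mlp.evaluate(X.reshape(100, 10), y, verbose=0))
print(lstm.evaluate(X, y, verbose=0))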

Read more

Multivariate Time Series Forecasting with LSTMs in Keras

Last Updated on August 28, 2020 Neural networks like Long Short-Term Memory (LSTM) recurrent neural networks are able to almost seamlessly model problems with multiple input variables. This is a great benefit in time series forecasting, where classical linear methods can be difficult to adapt to multivariate or multiple input forecasting problems. In this tutorial, you will discover how you can develop an LSTM model for multivariate time series forecasting with the Keras deep learning library. After completing this tutorial, […]
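
A minimal sketch of the core idea, assuming the multivariate data has already been framed as a supervised learning problem: each sample is a short window of timesteps over several input features, and the LSTM maps that 3D input to a single forecast value. The shapes and hyperparameters below are illustrative, not the tutorial's exact ones.

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# illustrative shapes: 1000 samples, 3 timesteps, 8 parallel input series
X = np.random.rand(1000, 3, 8)
y = np.random.rand(1000, 1)

model = Sequential()
model.add(LSTM(50, input_shape=(X.shape[1], X.shape[2])))
model.add(Dense(1))
model.compile(loss='mae', optimizer='adam')
model.fit(X, y, epochs=50, batch_size=72, verbose=0)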

Read more

Mini-Course on Long Short-Term Memory Recurrent Neural Networks with Keras

Last Updated on August 14, 2019 Long Short-Term Memory (LSTM) recurrent neural networks are one of the most interesting types of deep learning at the moment. They have been used to demonstrate world-class results in complex problem domains such as language translation, automatic image captioning, and text generation. LSTMs are different from multilayer Perceptrons and convolutional neural networks in that they are designed specifically for sequence prediction problems. In this mini-course, you will discover how you can quickly bring LSTM […]

Read more

Stacked Long Short-Term Memory Networks

Last Updated on August 14, 2019 Gentle introduction to the Stacked LSTM with example code in Python. The original LSTM model is comprised of a single hidden LSTM layer followed by a standard feedforward output layer. The Stacked LSTM is an extension to this model that has multiple hidden LSTM layers where each layer contains multiple memory cells. In this post, you will discover the Stacked LSTM model architecture. After completing this tutorial, you will know: The benefit of deep neural […]
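
In Keras terms, stacking amounts to setting return_sequences=True on every LSTM layer except the last, so each layer passes a full sequence rather than a single vector to the layer above. A minimal sketch with illustrative sizes:

from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
# first hidden LSTM layer returns one output per timestep...
model.add(LSTM(50, return_sequences=True, input_shape=(10, 1)))
# ...which the second hidden LSTM layer consumes as its input sequence
model.add(LSTM(50))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')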

Read more

CNN Long Short-Term Memory Networks

Last Updated on August 14, 2019 Gentle introduction to CNN LSTM recurrent neural networks with example Python code. Input with spatial structure, like images, cannot be modeled easily with the standard Vanilla LSTM. The CNN Long Short-Term Memory Network or CNN LSTM for short is an LSTM architecture specifically designed for sequence prediction problems with spatial inputs, like images or videos. In this post, you will discover the CNN LSTM architecture for sequence prediction. After completing this post, you will know: […]
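
The usual Keras recipe is to wrap the CNN front end in TimeDistributed so the same feature extractor is applied to every frame, then feed the per-frame feature vectors to the LSTM. A minimal sketch; the frame size, filter count, and output layer are illustrative assumptions:

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, LSTM, Dense, TimeDistributed

model = Sequential()
# apply the same small CNN to each of 10 frames of 64x64 grayscale input
model.add(TimeDistributed(Conv2D(16, (3, 3), activation='relu'),
                          input_shape=(10, 64, 64, 1)))
model.add(TimeDistributed(MaxPooling2D((2, 2))))
model.add(TimeDistributed(Flatten()))
# interpret the resulting sequence of frame features
model.add(LSTM(50))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')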

Read more

Encoder-Decoder Long Short-Term Memory Networks

Last Updated on August 14, 2019 Gentle introduction to the Encoder-Decoder LSTMs for sequence-to-sequence prediction with example Python code. The Encoder-Decoder LSTM is a recurrent neural network designed to address sequence-to-sequence problems, sometimes called seq2seq. Sequence-to-sequence prediction problems are challenging because the number of items in the input and output sequences can vary. Text translation and learning to execute programs are examples of seq2seq problems. In this post, you will discover the Encoder-Decoder LSTM architecture for sequence-to-sequence prediction. After […]
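
The standard Keras pattern encodes the input sequence to a fixed-length vector, repeats that vector once per output timestep, and decodes it back into a sequence. A minimal sketch; the sequence lengths and layer width are illustrative:

from keras.models import Sequential
from keras.layers import LSTM, Dense, RepeatVector, TimeDistributed

model = Sequential()
model.add(LSTM(100, input_shape=(10, 1)))      # encoder: summarize the 10-step input
model.add(RepeatVector(5))                     # repeat the summary once per output step
model.add(LSTM(100, return_sequences=True))    # decoder: unroll 5 output steps
model.add(TimeDistributed(Dense(1)))           # one prediction per decoded step
model.compile(loss='mse', optimizer='adam')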

Read more

Gentle Introduction to Generative Long Short-Term Memory Networks

Last Updated on August 14, 2019 The Long Short-Term Memory recurrent neural network was developed for sequence prediction. In addition to sequence prediction problems, LSTMs can also be used as generative models. In this post, you will discover how LSTMs can be used as generative models. After completing this post, you will know: About generative models, with a focus on generative models for text called language modeling. Examples of applications where LSTM Generative models have been used. Examples of […]
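
As a rough illustration of the language-modeling framing, the sketch below defines a character-level model that predicts the next character from a fixed window of one-hot encoded characters; the window length and vocabulary size are arbitrary assumptions for the example:

from keras.models import Sequential
from keras.layers import LSTM, Dense

# given 40 one-hot encoded characters over a 50-symbol vocabulary,
# predict a probability distribution over the next character
model = Sequential()
model.add(LSTM(128, input_shape=(40, 50)))
model.add(Dense(50, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
# after fitting on text, generate by sampling from model.predict(...)
# repeatedly and sliding the input window forward one character at a time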

Read more

How to Make Predictions with Long Short-Term Memory Models in Keras

Last Updated on August 14, 2019 The goal of developing an LSTM model is a final model that you can use on your sequence prediction problem. In this post, you will discover how to finalize your model and use it to make predictions on new data. After completing this post, you will know: How to train a final LSTM model. How to save your final LSTM model, and later load it again. How to make predictions on new data. Kick-start […]
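
A minimal end-to-end sketch of the workflow on toy data: fit a final model, save it to file, load it back, and call predict on new input. The model size and file name here are illustrative:

import numpy as np
from keras.models import Sequential, load_model
from keras.layers import LSTM, Dense

# fit a small final model on toy data (shapes illustrative)
X = np.random.rand(100, 5, 1)
y = np.random.rand(100, 1)
model = Sequential([LSTM(10, input_shape=(5, 1)), Dense(1)])
model.compile(loss='mse', optimizer='adam')
model.fit(X, y, epochs=5, verbose=0)

# save the finalized model, then load it again (e.g. in a later session)
model.save('lstm_model.h5')
loaded = load_model('lstm_model.h5')

# new data must match the trained input shape: [samples, timesteps, features]
Xnew = np.random.rand(1, 5, 1)
print(loaded.predict(Xnew))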

Read more

How to Reshape Input Data for Long Short-Term Memory Networks in Keras

Last Updated on August 14, 2019 It can be difficult to understand how to prepare your sequence data for input to an LSTM model. Often there is confusion around how to define the input layer for the LSTM model. There is also confusion about how to convert your sequence data that may be a 1D or 2D matrix of numbers to the required 3D format of the LSTM input layer. In this tutorial, you will discover how to define the […]
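
The key point is that the LSTM input layer expects a 3D array of [samples, timesteps, features], so a 2D matrix must gain an explicit third dimension. A minimal sketch, treating each row of a 2D matrix as one sample of 10 timesteps with a single feature:

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

data = np.random.rand(100, 10)                        # 2D: 100 rows, 10 columns
X = data.reshape((data.shape[0], data.shape[1], 1))   # 3D: [samples, timesteps, features]

model = Sequential()
model.add(LSTM(32, input_shape=(X.shape[1], X.shape[2])))  # (timesteps, features)
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')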

Read more

How to Diagnose Overfitting and Underfitting of LSTM Models

Last Updated on January 8, 2020 It can be difficult to determine whether your Long Short-Term Memory model is performing well on your sequence prediction problem. You may be getting a good model skill score, but it is important to know whether your model is a good fit for your data or if it is underfit or overfit and could do better with a different configuration. In this tutorial, you will discover how you can diagnose the fit of your […]
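
The usual diagnostic is a learning curve: train with a validation split and plot training loss against validation loss. Roughly, two high flat curves suggest underfitting, while a training curve that keeps improving as the validation curve turns upward suggests overfitting. A minimal sketch on toy data with illustrative sizes:

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense
from matplotlib import pyplot

X = np.random.rand(200, 10, 1)
y = np.random.rand(200, 1)

model = Sequential([LSTM(10, input_shape=(10, 1)), Dense(1)])
model.compile(loss='mse', optimizer='adam')
history = model.fit(X, y, validation_split=0.3, epochs=100, verbose=0)

# compare the two curves to judge whether the model under- or overfits
pyplot.plot(history.history['loss'], label='train')
pyplot.plot(history.history['val_loss'], label='validation')
pyplot.legend()
pyplot.show()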

Read more