A Gentle Introduction to Exploding Gradients in Neural Networks
Last Updated on August 14, 2019
Exploding gradients are a problem where large error gradients accumulate and result in very large updates to neural network model weights during training.
This makes your model unstable and unable to learn from your training data.
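To see the mechanism at work, here is a minimal NumPy sketch (not from the post) that simulates backpropagating an error gradient through a stack of linear layers; the depth of 50, the dimension of 10, and the 0.5 weight scale are arbitrary choices for illustration.

```python
import numpy as np

# Simulate backpropagating an error gradient through a deep stack
# of linear layers. Each layer multiplies the gradient by its weight
# matrix; with these weights the norm grows by roughly 1.6x per layer,
# so the gradient "explodes" exponentially with depth.
np.random.seed(1)
n_layers = 50
grad = np.random.randn(10)

for layer in range(n_layers):
    W = np.random.randn(10, 10) * 0.5  # arbitrary demo scale
    grad = W.T @ grad
    if (layer + 1) % 10 == 0:
        print(f"layer {layer + 1}: gradient norm = {np.linalg.norm(grad):.3e}")
```

Running this prints gradient norms that climb by several orders of magnitude every ten layers, which is the accumulation the paragraph above describes.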
In this post, you will discover the problem of exploding gradients with deep artificial neural networks.
After completing this post, you will know:
- What exploding gradients are and the problems they cause during training.
- How to know whether you may have exploding gradients with your network model.
- How you can fix the exploding gradient problem with your network.
Kick-start your project with my new book Long Short-Term Memory Networks With Python, including step-by-step tutorials and the Python source code files for all examples.
Let’s get started.
- Update Oct/2018: Removed mention of ReLU as a solution.
What Are Exploding Gradients?
An error gradient is the direction and magnitude calculated during the training of a neural network that is used to update the network weights in the right direction and by the right amount.
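To make the definition concrete, a single gradient-descent weight update looks like the following minimal sketch; the learning rate and the toy values are arbitrary choices for illustration.

```python
import numpy as np

# One step of gradient descent: move the weights a small amount
# in the direction that reduces the error.
learning_rate = 0.01
weights = np.array([0.5, -0.3])
error_gradient = np.array([2.0, -1.5])  # direction and magnitude of the error

# Update rule: new weights = old weights - learning rate * gradient.
# If the gradient is very large, the update is very large too,
# which is exactly how exploding gradients destabilize training.
weights = weights - learning_rate * error_gradient
print(weights)
```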