Dropout Regularization in Deep Learning Models With Keras
Last Updated on August 27, 2020
Dropout is a simple and powerful regularization technique for neural networks and deep learning models.
In this post, you will discover the dropout regularization technique and how to apply it to your models in Python with Keras.
After reading this post, you will know:
- How the dropout regularization technique works.
- How to use dropout on your input layers.
- How to use dropout on your hidden layers.
- How to tune the dropout level on your problem.
Kick-start your project with my new book Deep Learning With Python, including step-by-step tutorials and the Python source code files for all examples.
Let’s get started.
- Update Oct/2016: Updated for Keras 1.1.0, TensorFlow 0.10.0 and scikit-learn v0.18.
- Update Mar/2017: Updated for Keras 2.0.2, TensorFlow 1.0.1 and Theano 0.9.0.
- Update Sep/2019: Updated for Keras 2.2.5 API.
Dropout Regularization For Neural Networks
Dropout is a regularization technique for neural network models proposed by Srivastava et al. in their 2014 paper "Dropout: A Simple Way to Prevent Neural Networks from Overfitting".
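To make the idea concrete before going further, here is a minimal sketch of where Dropout layers can sit in a Keras Sequential model. The 60-dimensional input, the layer sizes, and the 20% dropout rate are illustrative values chosen for this sketch, not settings from a specific experiment in the tutorial.

```python
# Minimal, illustrative sketch: Dropout applied to the visible (input)
# layer and to a hidden layer of a small Keras Sequential model.
# The input size, layer widths, and 0.2 dropout rate are example values only.
from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dropout(0.2, input_shape=(60,)))   # drop 20% of the input features during training
model.add(Dense(60, activation='relu'))
model.add(Dropout(0.2))                      # drop 20% of this hidden layer's outputs
model.add(Dense(30, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy', optimizer='sgd', metrics=['accuracy'])
```

Note that Keras only applies dropout while training; at prediction time the Dropout layers pass activations through unchanged, so no extra handling is needed when evaluating or using the model.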