Difference Between a Batch and an Epoch in a Neural Network
Last Updated on October 26, 2019
Stochastic gradient descent is a learning algorithm that has a number of hyperparameters.
Two hyperparameters that often confuse beginners are the batch size and the number of epochs. They are both integer values and, on the surface, appear to control the same thing.
In this post, you will discover the difference between batches and epochs in stochastic gradient descent.
After reading this post, you will know:
- Stochastic gradient descent is an iterative learning algorithm that uses a training dataset to update a model.
- The batch size is a hyperparameter of gradient descent that controls the number of training samples to work through before the model’s internal parameters are updated.
- The number of epochs is a hyperparameter of gradient descent that controls the number of complete passes through the training dataset (a short sketch after this list shows how the two interact).
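To make the relationship concrete, here is a minimal Python sketch showing how the batch size and the number of epochs together determine how many times the model's weights are updated. The numbers used (dataset size, batch size, epochs) are hypothetical and chosen only for illustration; they are not taken from any particular dataset.

```python
# Hypothetical values, for illustration only.
n_samples = 200   # size of the training dataset
batch_size = 5    # samples worked through before each weight update
n_epochs = 10     # complete passes through the training dataset

# With a batch size of 5, the weights are updated once per batch.
batches_per_epoch = n_samples // batch_size       # updates per epoch
total_updates = batches_per_epoch * n_epochs      # updates over all training

print(f"{batches_per_epoch} batches (weight updates) per epoch")  # 40
print(f"{total_updates} weight updates in total")                 # 400
```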
Kick-start your project with my new book Deep Learning With Python, including step-by-step tutorials and the Python source code files for all examples.
Let’s get started.