Difference Between a Batch and an Epoch in a Neural Network
Last Updated on October 26, 2019

Stochastic gradient descent is a learning algorithm that has a number of hyperparameters. Two hyperparameters that often confuse beginners are the batch size and the number of epochs. Both are integer values, and they appear to do the same thing. In this post, you will discover the difference between batches and epochs in stochastic gradient descent. After reading this post, you will know: Stochastic gradient descent is an iterative learning algorithm that uses a training dataset to update a model. […]
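To make the distinction concrete, here is a minimal sketch of mini-batch stochastic gradient descent on a small linear-regression problem. The dataset, model, and learning rate are illustrative assumptions, not from the post; only the loop structure matters: the batch size sets how many samples are seen before each parameter update, and the number of epochs sets how many full passes are made over the training set.

```python
# A minimal sketch of mini-batch stochastic gradient descent on a
# hypothetical linear-regression problem. The inner loop runs once per
# batch (one parameter update each); the outer loop runs once per epoch
# (one full pass over the training data).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))              # 200 training samples, 3 features
true_w = np.array([1.5, -2.0, 0.5])        # assumed "true" weights
y = X @ true_w + rng.normal(scale=0.1, size=200)

w = np.zeros(3)        # model parameters to learn
batch_size = 32        # hyperparameter: samples per parameter update
epochs = 5             # hyperparameter: full passes over the training set
lr = 0.1               # assumed learning rate

for epoch in range(epochs):
    order = rng.permutation(len(X))        # reshuffle samples each epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]      # one batch of samples
        Xb, yb = X[idx], y[idx]
        error = Xb @ w - yb
        grad = 2 * Xb.T @ error / len(idx)         # mean-squared-error gradient
        w -= lr * grad                             # one update per batch

batches_per_epoch = int(np.ceil(len(X) / batch_size))
print(f"{epochs} epochs x {batches_per_epoch} batches/epoch "
      f"= {epochs * batches_per_epoch} parameter updates; w ~ {w.round(2)}")
```

With 200 samples and a batch size of 32, each epoch performs 7 parameter updates (the last batch is smaller), so 5 epochs yield 35 updates in total.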