
The Confusion: What are Batch Size and Epoch in Neural Networks?

When setting the epoch and batch size, you may have wondered why you need to set these two values at all: why isn't the epoch alone enough, why do you also need a batch size, and what does it actually do?

For beginners, batch size and epoch can sound very confusing, since at first glance they seem to do a very similar job in training a model.

To understand the difference between batch size and epoch, we first need to understand what gradient descent is and how it optimizes the model's internal parameters (the weights).

If you don't want to go deeper, here is the short version: the batch size is the number of samples from the dataset that are passed through the network before the internal weights are updated. It can be any value from one up to the total dataset size.
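As a rough illustration, here is a minimal sketch of batch-wise gradient descent on a toy linear model. The data, learning rate, and batch size of 32 are made-up values for demonstration only.

```python
import numpy as np

# Hypothetical toy dataset: 1000 samples, 1 feature
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 1))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=1000)

w, b = 0.0, 0.0      # internal model parameters (weights)
lr = 0.01            # learning rate
batch_size = 32      # samples seen before each weight update

# One pass over the dataset, updating the weights once per batch
for start in range(0, len(X), batch_size):
    xb = X[start:start + batch_size, 0]
    yb = y[start:start + batch_size]
    pred = w * xb + b
    err = pred - yb
    # Gradients of mean squared error with respect to w and b
    grad_w = 2.0 * np.mean(err * xb)
    grad_b = 2.0 * np.mean(err)
    w -= lr * grad_w   # weights change after every batch,
    b -= lr * grad_b   # not after every single sample
```

With 1,000 samples and a batch size of 32, the weights are updated 32 times during this single pass over the data (31 full batches plus one smaller final batch).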

The epoch count is the number of times the entire dataset is passed through the network. It can be any value from one upward, which means you can train a model on the same dataset as many times as you want.
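In a high-level framework such as Keras, both values are simply arguments to `fit`. The sketch below assumes a TensorFlow/Keras installation and reuses hypothetical toy data; the model and numbers are for illustration only.

```python
import numpy as np
from tensorflow import keras

# Hypothetical toy dataset: 1000 samples, 1 feature
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 1))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=1000)

# A minimal one-layer model
model = keras.Sequential([
    keras.layers.Input(shape=(1,)),
    keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mse")

# batch_size=32 -> weights are updated every 32 samples
# epochs=10     -> the full 1000-sample dataset is passed through the network 10 times
model.fit(X, y, batch_size=32, epochs=10)
```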
