What do we mean by Epoch, Batch, Iterations in a Neural Network?

#Epoch #Batch #Iterations #neuralnetworks

Epoch

In practice we often have very large datasets that cannot be fed to the computer all at once, so the entire dataset has to be divided up before it is passed to the neural network. An epoch is one complete pass of the whole dataset forward and backward through the neural network.
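As a minimal sketch of what one epoch means (not code from the original article): the names model, dataset, and train_step below are hypothetical placeholders, where train_step stands for a single forward and backward pass.

```python
# A minimal sketch of "one epoch". `train_step` is a hypothetical placeholder
# for one forward and backward pass on a single example (or batch).
def run_one_epoch(model, dataset, train_step):
    # One epoch: every sample in the dataset goes forward and backward
    # through the network exactly once.
    for x, y in dataset:
        train_step(model, x, y)

def train(model, dataset, train_step, num_epochs):
    # Training normally repeats the epoch several times (see the Batch
    # section below on why a single epoch usually underfits).
    for epoch in range(num_epochs):
        run_one_epoch(model, dataset, train_step)
```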

Batch

An epoch's worth of data is too big to feed to the computer at once, so we divide it into several parts called batches, and the dataset is passed through the same neural network multiple times. Training for only one epoch typically leaves the curve underfitted; to avoid this we increase the number of epochs. As the number of epochs grows, the weights are updated more times, and the fit moves from underfitting to optimal fitting and eventually to overfitting. There is no single correct number of epochs; it depends on how diverse the dataset is.

Batch size is the number of training samples in one batch. Batch size is not the same as the number of batches.
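To make the distinction between batch size and number of batches concrete, here is a small hedged example using a toy list of 10 samples; the numbers and the split_into_batches helper are made up purely for illustration.

```python
import math

def split_into_batches(examples, batch_size):
    """Split a list of training examples into consecutive batches."""
    return [examples[i:i + batch_size]
            for i in range(0, len(examples), batch_size)]

examples = list(range(10))                    # toy dataset of 10 samples
batch_size = 4                                # batch size: samples per batch
batches = split_into_batches(examples, batch_size)

print(len(batches))                           # number of batches: 3
print(math.ceil(len(examples) / batch_size))  # same value, computed directly
print([len(b) for b in batches])              # [4, 4, 2] -- last batch is smaller
```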

Iterations

Iterations are the number of batches needed to complete one epoch. In other words, the number of batches equals the number of iterations in one epoch.


Let’s say we have 1,000 training examples.

If we divide this dataset of 1,000 examples into batches of 250, it will take 4 iterations to complete 1 epoch.
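The following is a rough sketch of the 1,000-example case above, with the actual weight update omitted; the number of epochs chosen here is arbitrary and only serves to show how epochs, batches, and iterations relate.

```python
num_examples = 1000
batch_size = 250
num_epochs = 3                       # arbitrary choice, for illustration only

iterations_per_epoch = num_examples // batch_size   # 1000 / 250 = 4
total_iterations = 0

for epoch in range(num_epochs):
    for start in range(0, num_examples, batch_size):
        batch = range(start, start + batch_size)    # indices of one batch
        # ... forward and backward pass on `batch` would go here ...
        total_iterations += 1

print(iterations_per_epoch)   # 4 iterations complete one epoch
print(total_iterations)       # 12 iterations over 3 epochs
```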




