An epoch is one complete pass of the entire training dataset through the model during training.
In each epoch, the model sees and learns from every example in the training data once.
The number of epochs is a hyperparameter that you set before training.
It determines how many times the model will iterate over the entire dataset.
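In practice, the epoch count is just the number of times an outer training loop repeats a full pass over the data. The sketch below is one minimal way this might look; it assumes PyTorch, and the toy dataset, model, and num_epochs value are purely illustrative:

```python
# Minimal training-loop sketch (assumes PyTorch; data, model, and num_epochs are illustrative).
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset: 100 examples with 4 features each.
X = torch.randn(100, 4)
y = torch.randn(100, 1)
loader = DataLoader(TensorDataset(X, y), batch_size=20, shuffle=True)

model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

num_epochs = 10  # hyperparameter: how many full passes over the dataset

for epoch in range(num_epochs):
    # One epoch = one complete pass over every batch in the training set.
    for batch_x, batch_y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(batch_x), batch_y)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1}/{num_epochs}, last batch loss: {loss.item():.4f}")
```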
Too few epochs can lead to underfitting, where the model doesn't learn enough from the data.
Too many epochs can lead to overfitting, where the model memorizes the training data too well and performs poorly on new, unseen data.
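A common way to balance these two failure modes is to set a generous upper bound on the number of epochs and stop early once performance on held-out data stops improving. The sketch below shows one possible version of this idea; it assumes PyTorch, and the patience value, data split, and model are illustrative choices, not a prescribed recipe:

```python
# Early-stopping sketch (assumes PyTorch; patience, split sizes, and model are illustrative).
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

X, y = torch.randn(200, 4), torch.randn(200, 1)
train_loader = DataLoader(TensorDataset(X[:160], y[:160]), batch_size=20, shuffle=True)
val_x, val_y = X[160:], y[160:]  # held-out validation split

model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

best_val, patience, bad_epochs = float("inf"), 3, 0

for epoch in range(100):  # upper bound on the number of epochs
    model.train()
    for batch_x, batch_y in train_loader:  # one full pass = one epoch
        optimizer.zero_grad()
        loss_fn(model(batch_x), batch_y).backward()
        optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(val_x), val_y).item()

    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            # Validation loss has stopped improving: more epochs would
            # mostly memorize the training set (overfitting), so stop here.
            break
```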