Machine Learning

Epoch

One complete pass through the entire training dataset. Models typically need multiple epochs to learn effectively; each pass refines the model's weights a little further.

Why It Matters

Too few epochs leaves a model undertrained; too many can cause overfitting, where the model memorizes the training data instead of generalizing. Finding the right number, often by monitoring validation loss, is key to efficient training.
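One common way to pick the number of epochs is early stopping: train until validation loss stops improving, then keep the best epoch. A minimal sketch, assuming per-epoch validation losses have already been computed (the loss values and the `patience` setting below are illustrative):

```python
# Illustrative validation losses: they improve, then start rising after epoch 4.
val_losses = [0.9, 0.6, 0.45, 0.40, 0.41, 0.44]

patience = 2  # how many non-improving epochs to tolerate before stopping
best_loss, best_epoch, waited = float("inf"), 0, 0
for epoch, loss in enumerate(val_losses, start=1):
    if loss < best_loss:
        best_loss, best_epoch, waited = loss, epoch, 0
    else:
        waited += 1
        if waited >= patience:
            break  # validation loss stopped improving: stop training

print(best_epoch)  # → 4
```

In practice the model checkpoint saved at `best_epoch` is the one kept.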

Example

Training a model for 50 epochs means the model has seen every example in the training set 50 times, each time refining its weights slightly.
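The epoch loop above can be sketched in a few lines. This is a toy example, assuming a tiny dataset and a single-weight linear model trained with gradient descent; the names and values are illustrative, not from any specific library:

```python
# (input, target) pairs; the true relationship is y = 2x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0    # model weight, refined slightly on every pass
lr = 0.01  # learning rate

num_epochs = 50
for epoch in range(num_epochs):
    # One epoch = one full pass over every example in the dataset.
    for x, y in data:
        pred = w * x
        grad = 2 * (pred - y) * x  # gradient of squared error w.r.t. w
        w -= lr * grad             # small weight update

print(round(w, 3))  # → 2.0, approached gradually over the 50 epochs
```

After 50 passes over the same three examples, the weight has converged close to the true value of 2.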

Think of it like...

Like re-reading a textbook chapter multiple times — each pass helps you pick up details and connections you missed before.

Related Terms

Batch, Iteration, Learning Rate, Overfitting, Early Stopping