Epoch
One complete pass through the entire training dataset during model training. Models typically require multiple epochs to learn effectively, with each pass refining the model's understanding.
Why It Matters
Too few epochs leave a model undertrained; too many can cause overfitting. Finding the right number is key to efficient training.
Example
Training a model for 50 epochs means the model has seen every example in the training set 50 times, each time refining its weights slightly.
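The epoch loop above can be sketched in a few lines. This is a minimal toy illustration, not a real training setup: the dataset, the single-weight model, and the learning rate are all made-up assumptions chosen so the loop converges.

```python
# Toy sketch of a 50-epoch training loop fitting y = 2x with one weight.
# Dataset, model, and learning rate are illustrative assumptions.
data = [(x, 2 * x) for x in range(10)]  # the full training set

w = 0.0      # single model weight, learned from scratch
lr = 0.01    # learning rate (step size)

for epoch in range(50):             # 50 epochs = 50 full passes over the data
    for x, y in data:               # every example is seen once per epoch
        pred = w * x
        grad = 2 * (pred - y) * x   # gradient of the squared error w.r.t. w
        w -= lr * grad              # refine the weight slightly on each example
```

After 50 passes the weight settles very close to the true slope of 2, which is the "refining its weights slightly" each epoch performs.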
Think of it like...
Like re-reading a textbook chapter multiple times — each pass helps you pick up details and connections you missed before.
Related Terms
Batch Size
The number of training examples processed together before the model updates its parameters. Batch size affects training speed, memory usage, and how smoothly the model learns.
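As a concrete sketch of the relationship between batch size and update frequency, the snippet below splits a toy dataset into batches; the dataset size and batch size are illustrative assumptions.

```python
# Sketch: batch size determines how many parameter updates happen per epoch.
def make_batches(examples, batch_size):
    """Split a dataset into consecutive batches of at most batch_size items."""
    return [examples[i:i + batch_size] for i in range(0, len(examples), batch_size)]

dataset = list(range(1000))          # 1,000 toy training examples
batches = make_batches(dataset, 32)  # batch size of 32

# One epoch now performs len(batches) parameter updates:
# 31 full batches of 32 examples, plus one final batch of 8.
```

A smaller batch size means more (noisier) updates per epoch; a larger one means fewer, smoother updates but higher memory use.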
Learning Rate
A hyperparameter that controls how much the model's weights are adjusted in response to errors during each training step. It determines the size of the steps taken during gradient descent optimization.
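The effect of step size can be seen by running the same gradient-descent update with different learning rates. This is a toy sketch: the quadratic loss, the specific rates, and the step count are illustrative assumptions.

```python
# Sketch: learning rate scales each gradient-descent step on f(w) = (w - 3)**2.
def step(w, lr):
    """One gradient-descent update toward the minimum at w = 3."""
    grad = 2 * (w - 3)       # derivative of the loss at w
    return w - lr * grad     # move against the gradient, scaled by lr

w_small = w_good = w_big = 0.0
for _ in range(20):
    w_small = step(w_small, 0.01)  # too small: creeps slowly toward 3
    w_good  = step(w_good, 0.5)    # well chosen: reaches the minimum quickly
    w_big   = step(w_big, 1.1)     # too large: each step overshoots and diverges
```

Too small a rate wastes epochs inching forward; too large a rate overshoots the minimum and can make training diverge entirely.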
Training Data
The dataset used to teach a machine learning model. It contains examples (and often labels) that the model learns patterns from during the training process. The quality and quantity of training data directly impact model performance.
Overfitting
When a model learns the training data too well — including its noise and random fluctuations — and performs poorly on new, unseen data. The model essentially memorizes rather than generalizes.
Early Stopping
A regularization technique where training is halted when the model's performance on validation data stops improving, even if training loss continues to decrease. It prevents overfitting by finding the optimal training duration.
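The early-stopping logic can be sketched as a patience counter over validation loss. The loss values and patience setting below are made-up assumptions chosen to show the pattern.

```python
# Sketch of early stopping on an illustrative validation-loss curve.
val_losses = [0.9, 0.7, 0.55, 0.48, 0.45, 0.44, 0.46, 0.49, 0.53, 0.60]

patience = 2               # tolerate this many epochs without improvement
best_loss = float("inf")
best_epoch = 0
bad_epochs = 0

for epoch, loss in enumerate(val_losses):
    if loss < best_loss:               # validation improved: record and continue
        best_loss, best_epoch = loss, epoch
        bad_epochs = 0
    else:                              # no improvement this epoch
        bad_epochs += 1
        if bad_epochs >= patience:     # patience exhausted: halt training
            break
```

Here training halts after two non-improving epochs, and the model from the best epoch (loss 0.44) would be kept even though later epochs kept running briefly.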