Machine Learning

Hyperparameter

Settings configured before training begins that control how the model learns, as opposed to parameters, which are learned during training. Examples include learning rate, batch size, and number of layers.

Why It Matters

Proper hyperparameter tuning can substantially improve a model's performance without any change to the architecture or data, making it one of the highest-ROI activities in ML development.

Example

Setting the learning rate to 0.001, the batch size to 32, and the number of training epochs to 100 before training a neural network.
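The distinction can be made concrete in code. Below is a minimal sketch (plain Python, fitting a toy linear model with mini-batch gradient descent) using the hyperparameter values from the example above; the data and variable names are illustrative, not from any particular library:

```python
import random

# Hyperparameters: fixed before training starts.
LEARNING_RATE = 0.001
BATCH_SIZE = 32
EPOCHS = 100

# Synthetic data: y = 3x + 2 plus a little noise.
random.seed(0)
data = [(x, 3 * x + 2 + random.gauss(0, 0.1))
        for x in (i / 100 for i in range(1000))]

# Parameters: learned *during* training, unlike the settings above.
w, b = 0.0, 0.0

for epoch in range(EPOCHS):
    random.shuffle(data)
    for i in range(0, len(data), BATCH_SIZE):
        batch = data[i:i + BATCH_SIZE]
        # Gradient of mean squared error over the mini-batch.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in batch) / len(batch)
        grad_b = sum(2 * (w * x + b - y) for x, y in batch) / len(batch)
        w -= LEARNING_RATE * grad_w
        b -= LEARNING_RATE * grad_b

# w and b move toward the true values 3 and 2.
print(f"learned w={w:.2f}, b={b:.2f}")
```

Note that changing `LEARNING_RATE` or `EPOCHS` changes how well the final `w` and `b` fit, even though the model and data are identical: that sensitivity is why tuning these settings matters.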

Think of it like...

Like the settings on an oven before baking — temperature, time, and rack position are set before you start, and they dramatically affect the final result.

Related Terms