Machine Learning

Bayesian Optimization

A sequential optimization strategy that finds good hyperparameters by building a probabilistic surrogate model of the objective function (commonly a Gaussian process) and using an acquisition function to select the most promising configuration to evaluate next.

Why It Matters

Bayesian optimization finds good hyperparameters in far fewer evaluations than grid or random search, saving significant compute cost and time.

Example

Finding an optimal learning rate, dropout rate, and layer count for a model in roughly 30 evaluations instead of the 1,000+ that an exhaustive grid search over the same ranges would need.
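
The loop described above can be sketched in miniature. This is a toy, self-contained illustration, not a production implementation: it hand-rolls a Gaussian-process surrogate with an RBF kernel and an expected-improvement acquisition function, and tunes a single hypothetical hyperparameter (log10 of the learning rate) against a made-up "validation loss" curve. The objective function, search range, and all constants are illustrative assumptions.

```python
import numpy as np
from math import erf

def rbf_kernel(a, b, length_scale=1.0):
    # Squared-exponential kernel between 1-D point arrays a and b.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length_scale) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-6):
    # Gaussian-process posterior mean and std. dev. at the query points.
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    K_s = rbf_kernel(x_obs, x_query)
    mu = K_s.T @ np.linalg.solve(K, y_obs)
    v = np.linalg.solve(K, K_s)
    var = 1.0 - np.sum(K_s * v, axis=0)  # RBF prior variance is 1
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best_y):
    # EI for minimization: expected amount by which we beat best_y.
    z = (best_y - mu) / sigma
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (best_y - mu) * cdf + sigma * pdf

def objective(x):
    # Hypothetical validation loss vs. log10(learning rate);
    # its minimum sits near x = -2.8 (i.e. lr ~ 1.5e-3).
    return (x + 3.0) ** 2 + 0.1 * np.sin(5.0 * x)

rng = np.random.default_rng(0)
grid = np.linspace(-5.0, -1.0, 200)      # candidate log10(lr) values
x_obs = rng.uniform(-5.0, -1.0, size=3)  # a few random initial trials
y_obs = objective(x_obs)

for _ in range(10):                      # 10 model-guided evaluations
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y_obs.min()))]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))

best_x = x_obs[y_obs.argmin()]
print(f"best log10(lr) after 13 evaluations: {best_x:.2f}")
```

Each iteration fits the surrogate to all results so far, then evaluates the point where expected improvement is highest, so expensive evaluations concentrate near the (estimated) optimum instead of being spread uniformly as in grid or random search.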

Think of it like...

Like a smart treasure hunter who uses clues from previous digs to decide where to search next, rather than digging randomly — each attempt informs the next.

Related Terms