Machine Learning

Gradient Boosting

An ensemble technique that builds models sequentially, where each new model focuses on correcting the errors made by the previous models — formally, each model is fit to the negative gradient of the loss on the ensemble's current predictions (for squared loss, simply the residuals), which is where the "gradient" in the name comes from. Combining many such weak learners yields a single strong learner.
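The sequential error-correcting loop above can be sketched from scratch. This is a minimal illustration, not a production implementation: it boosts depth-1 "decision stumps" on a one-dimensional regression problem with squared loss, so each round simply fits the next stump to the current residuals. All function names here are made up for the sketch.

```python
import numpy as np

def fit_stump(x, residual):
    # Find the split threshold that best fits the residual with a
    # two-leaf piecewise-constant model (a depth-1 "decision stump").
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        err = ((residual - pred) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda z: np.where(z <= t, lv, rv)

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    # Start from the constant mean prediction, then repeatedly fit a
    # weak learner to the current residuals (the negative gradient of
    # squared loss) and add a shrunken copy of it to the ensemble.
    pred = np.full_like(y, y.mean(), dtype=float)
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred          # the errors the ensemble still makes
        stump = fit_stump(x, residual)
        stumps.append(stump)
        pred += lr * stump(x)        # shrinkage: a small corrective step
    return lambda z: y.mean() + lr * sum(s(z) for s in stumps)

# Toy data: noise-free y = x^2
x = np.linspace(-3, 3, 40)
y = x ** 2
model = gradient_boost(x, y)
print(float(np.mean((y - model(x)) ** 2)))  # training MSE, far below baseline
```

Each round only has to model what the ensemble so far gets wrong, which is why many individually weak stumps add up to an accurate predictor.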

Why It Matters

Gradient boosting libraries (XGBoost, LightGBM, CatBoost) consistently win ML competitions on tabular data, and gradient boosting is the go-to algorithm for many production applications.

Example

XGBoost predicting customer churn by building 1000 small decision trees sequentially, each one focusing on the customers that previous trees got wrong.
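A churn setup like this can be sketched with scikit-learn's GradientBoostingClassifier, which follows the same sequential-trees scheme (XGBoost's own API is similar but is a separate library, so scikit-learn is used here as an assumed-available stand-in). The synthetic "churn" data and feature names below are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 500
# Hypothetical churn features: months of tenure and number of support calls
tenure = rng.uniform(0, 60, n)
support_calls = rng.poisson(2, n)
X = np.column_stack([tenure, support_calls])
# Synthetic rule: short-tenure customers with many support calls churn
churn = ((tenure < 12) & (support_calls > 2)).astype(int)

model = GradientBoostingClassifier(
    n_estimators=200,   # 200 small trees, built one after another
    max_depth=2,        # each tree is deliberately weak
    learning_rate=0.1,  # shrink each tree's contribution
)
model.fit(X, churn)
print(round(model.score(X, churn), 2))  # training accuracy
```

Note that `max_depth` is kept small on purpose: no single tree can capture the tenure-by-support-calls interaction well, but the sequence of trees, each correcting the misclassifications left by its predecessors, can.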

Think of it like...

Like a relay team where each runner focuses on making up for the previous runner's weak spots — together they achieve a better overall time than any single runner.

Related Terms