Machine Learning

Random Forest

An ensemble learning method that builds multiple decision trees during training and outputs the majority vote (classification) or average prediction (regression) of all the trees. Diversity comes from training each tree on a random bootstrap sample of the data and considering only a random subset of features at each split, so the 'forest' of diverse trees is more robust than any single tree.
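A minimal sketch of the voting mechanism, assuming scikit-learn is available (dataset and parameters here are illustrative, not part of the definition):

```python
# Fit a small random forest and inspect how its trees vote.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic classification data, purely for illustration.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X, y)

# Each fitted tree makes its own prediction on the same sample;
# the forest combines them (scikit-learn averages the trees'
# predicted probabilities, which closely tracks a majority vote).
tree_votes = [tree.predict(X[:1])[0] for tree in forest.estimators_]
print(forest.predict(X[:1])[0], "from", len(tree_votes), "tree votes")
```

Note that `n_estimators` (the number of trees) is the key knob: more trees reduce variance at the cost of training and prediction time.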

Why It Matters

Random forests are among the most reliable ML algorithms for tabular data: they work well out of the box with little tuning, resist overfitting better than single decision trees, and provide feature-importance scores that aid interpretation. They are widely used in production.

Example

A bank using a random forest of 500 decision trees to evaluate loan applications — each tree considers different factors, and the majority vote determines approval.
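The loan scenario above can be sketched as follows. Everything here is hypothetical: the feature names, the toy approval rule the forest learns, and the 50% threshold are invented for illustration, not real credit-scoring logic:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
# Hypothetical columns: income (k$), debt ratio, years of credit history.
X = rng.random((300, 3)) * [150, 1.0, 30]
# Toy labeling rule for the forest to learn (not real banking criteria).
y = (X[:, 0] > 60) & (X[:, 1] < 0.5)

# 500 trees, as in the example above.
forest = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)

applicant = [[90.0, 0.3, 12.0]]
# Averaged tree probabilities for the "approve" class -- roughly the
# share of trees voting to approve.
share = forest.predict_proba(applicant)[0, 1]
print("approve" if share > 0.5 else "deny", f"({share:.0%} of trees)")
```

Because each tree sees a different bootstrap sample and different feature subsets at each split, the trees effectively "consider different factors", and the pooled vote decides the outcome.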

Think of it like...

Like asking 500 different experts for their opinion and going with the majority answer — individual experts might be wrong, but the crowd is usually right.

Related Terms