Machine Learning

Confusion Matrix

A table that summarizes the performance of a classification model by showing true positives, true negatives, false positives, and false negatives. It reveals the types of errors a model makes.

Why It Matters

The confusion matrix tells you not just how often the model is wrong but how it is wrong — information that accuracy alone cannot provide.

Example

A medical test evaluated on 1,000 patients: 90 true positives (disease correctly detected), 5 false negatives (disease missed), 8 false positives (false alarms), and 897 true negatives. Accuracy here is 98.7%, yet only the matrix exposes the 5 missed cases hiding behind that number.
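The example above can be sketched in plain Python. This is a minimal illustration, not a library implementation: `confusion_counts` is a hypothetical helper, and the label lists are synthetic data constructed to reproduce the example's counts (90 TP, 5 FN, 8 FP, 897 TN).

```python
def confusion_counts(y_true, y_pred):
    """Return (tp, fn, fp, tn) for binary labels, where 1 = disease."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fn, fp, tn

# Synthetic labels matching the medical-test example:
# 95 patients with the disease, 905 without.
y_true = [1] * 95 + [0] * 905
y_pred = [1] * 90 + [0] * 5 + [1] * 8 + [0] * 897

tp, fn, fp, tn = confusion_counts(y_true, y_pred)
accuracy = (tp + tn) / (tp + fn + fp + tn)
recall = tp / (tp + fn)  # fraction of actual disease cases the test caught

print(tp, fn, fp, tn)        # 90 5 8 897
print(round(accuracy, 3))    # 0.987
print(round(recall, 3))      # 0.947
```

Note how accuracy (98.7%) looks excellent while recall (94.7%) reveals the missed cases: the matrix, not the single accuracy score, is what lets you compute both.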

Think of it like...

Like a scorecard that shows not just how many games a team won but exactly which opponents they beat and lost to — it reveals patterns in performance.

Related Terms