Artificial Intelligence

Model Monitoring

The practice of continuously tracking an ML model's performance, predictions, and input data in production to detect degradation, drift, or anomalies after deployment.
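One common way to quantify input drift is the Population Stability Index (PSI), which compares a feature's training-time distribution against its production distribution. A minimal sketch, assuming pre-binned counts; the bin counts and the 0.2 alert rule of thumb here are illustrative, not a specific tool's defaults:

```python
import math

def psi(expected_counts, actual_counts, eps=1e-4):
    """Population Stability Index over pre-binned counts; higher = more drift."""
    e_total = sum(expected_counts)
    a_total = sum(actual_counts)
    score = 0.0
    for e, a in zip(expected_counts, actual_counts):
        e_pct = max(e / e_total, eps)  # clamp to avoid log(0) on empty bins
        a_pct = max(a / a_total, eps)
        score += (a_pct - e_pct) * math.log(a_pct / e_pct)
    return score

# Identical distributions give a PSI near 0; a shifted distribution scores higher.
stable = psi([100, 200, 300], [100, 200, 300])   # ~0.0
shifted = psi([100, 200, 300], [300, 200, 100])  # well above the 0.2 rule of thumb
```

A common heuristic reads PSI below 0.1 as stable, 0.1 to 0.2 as moderate drift, and above 0.2 as significant drift worth investigating.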

Why It Matters

Without monitoring, models silently degrade over time as the world changes. A model that worked well last year may be making poor predictions today without anyone noticing.

Example

Dashboards tracking a fraud model's precision and recall daily, alerting the team when precision drops below 85% — indicating the model needs retraining.
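The alerting logic behind such a dashboard can be sketched in a few lines. The 85% threshold comes from the example above; the `check_precision` helper and its alert message are hypothetical stand-ins for whatever alerting system the team uses:

```python
def precision(true_positives: int, false_positives: int) -> float:
    """Fraction of flagged transactions that were actually fraud."""
    predicted_positives = true_positives + false_positives
    return true_positives / predicted_positives if predicted_positives else 0.0

def check_precision(tp: int, fp: int, threshold: float = 0.85) -> bool:
    """Return True (and emit an alert) when precision falls below the threshold."""
    p = precision(tp, fp)
    if p < threshold:
        print(f"ALERT: precision {p:.2%} below {threshold:.0%} -- consider retraining")
        return True
    return False

# 90 true frauds flagged, 10 false alarms -> precision 0.90, no alert
check_precision(90, 10)
# 80 true frauds flagged, 20 false alarms -> precision 0.80, alert fires
check_precision(80, 20)
```

In practice this check would run on a schedule (say, daily) against labeled production outcomes, with the alert routed to the on-call team rather than printed.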

Think of it like...

Like a patient wearing a heart monitor after surgery — continuous tracking catches problems early before they become emergencies.

Related Terms