Artificial Intelligence

Attention Map

A visualization of the attention weights a model assigns to different parts of its input when making a prediction. Attention maps reveal the model's internal focus patterns.

Why It Matters

Attention maps provide interpretability for transformer models, showing whether the model is looking at the right things when making decisions.

Example

A vision transformer's attention map highlighting the dog in an image when classifying it as 'dog,' showing the model focused on the animal and not the background.
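Under the hood, an attention map is just the softmax-normalized score matrix from scaled dot-product attention: each row shows how strongly one query position attends to every input position. A minimal NumPy sketch (the function name and toy shapes are illustrative, not from any particular library):

```python
import numpy as np

def attention_map(Q, K):
    """Illustrative attention map: softmax(Q K^T / sqrt(d))."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)   # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # each row sums to 1
    return weights

# Toy example: 3 query positions attending over 4 input positions (dim 8)
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
A = attention_map(Q, K)
print(A.shape)        # (3, 4): one row of weights per query position
print(A.sum(axis=1))  # each row sums to ~1.0
```

Plotting `A` as a heatmap (e.g. with `matplotlib.pyplot.imshow`) gives the familiar attention-map visualization, where brighter cells mark the input positions the model weighted most heavily.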

Think of it like...

Like eye-tracking studies that show where a person looks on a webpage — attention maps show where the AI 'looks' when processing information.

Related Terms