Artificial Intelligence

Black Box

A model or system whose internal workings are not visible or understandable to the user — you can see the inputs and outputs but not the reasoning in between. Most deep learning models are considered black boxes.

Why It Matters

The black-box nature of AI creates trust, regulatory, and debugging challenges. Heavily regulated industries like healthcare and finance are pushing for more transparent alternatives.

Example

A deep neural network with millions of parameters that accurately predicts cancer risk but cannot explain which specific factors drove a particular patient's risk score.
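The idea can be sketched in a few lines. Below is a hypothetical toy network in pure Python (names like predict and the feature values are illustrative, not from any real model): the inputs and the output score are fully visible, but the randomly initialized weights in between carry no human-readable meaning. Real models behave the same way, just with millions of weights instead of a handful.

```python
import math
import random

random.seed(0)

# Randomly initialized weights: 3 input features -> 4 hidden units -> 1 output.
# Individual weights have no human-interpretable meaning.
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
W2 = [random.uniform(-1, 1) for _ in range(4)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict(features):
    """Return a risk-like score in (0, 1).

    The inputs and the output are observable; the path the signal
    takes through W1 and W2 is not interpretable."""
    hidden = [sigmoid(sum(w * f for w, f in zip(row, features)))
              for row in W1]
    return sigmoid(sum(w * h for w, h in zip(W2, hidden)))

# We get a number back, but cannot point to which feature drove it.
score = predict([0.2, 0.9, 0.4])
```

Even in this tiny case, asking "which input caused the score?" has no direct answer: every feature influences every hidden unit, which in turn influences the output.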

Think of it like...

Like a vending machine: you insert money and make a selection, something happens inside that you cannot see, and a product comes out. You know what it does, but not how it does it.

Related Terms