Artificial Intelligence

Compute

The computational resources (processing power, memory, time) required to train or run AI models. Total compute is measured in FLOPs (floating-point operations), while hardware throughput is measured in FLOP/s; compute is a primary constraint and cost in AI development.
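A common back-of-the-envelope estimate for training compute is the C ≈ 6 · N · D rule of thumb (roughly 6 FLOPs per model parameter per training token). The sketch below applies it to a hypothetical model; the parameter and token counts are illustrative assumptions, not figures from this entry.

```python
# Rough training-compute estimate using the common C ~ 6 * N * D
# approximation (about 6 FLOPs per parameter per training token).

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training compute in FLOPs."""
    return 6 * params * tokens

# Hypothetical 7-billion-parameter model trained on 1 trillion tokens:
flops = training_flops(7e9, 1e12)
print(f"{flops:.1e} FLOPs")  # → 4.2e+22 FLOPs
```

Estimates like this are only order-of-magnitude guides, but they make concrete why scaling up models and datasets drives compute costs so sharply.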

Why It Matters

Compute is the fundamental currency of AI progress. Access to compute determines who can build frontier models and at what cost.

Example

Training GPT-4 is estimated to have cost over $100 million in compute, using thousands of GPUs running for months.

Think of it like...

Like the fuel needed for a rocket launch — bigger rockets (models) need disproportionately more fuel (compute), and fuel cost is a major constraint on what you can build.

Related Terms