Artificial Intelligence

Model Size

The number of parameters in a model, typically expressed in millions (M) or billions (B). Model size correlates loosely with capability but also determines compute and memory requirements.

Why It Matters

Model size is one of the first things practitioners evaluate when choosing a model. It determines hardware requirements, inference cost, and rough capability expectations.

Example

Llama 3 comes in 8B, 70B, and 405B parameter versions. Each step up increases capability, but memory requirements and inference cost also grow roughly in proportion to the parameter count.
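A common back-of-the-envelope estimate: memory for the weights alone is roughly the parameter count times the bytes per parameter at a given precision. The sketch below illustrates this rule of thumb; `weight_memory_gb` is a hypothetical helper, and real deployments also need memory for activations and the KV cache, which this ignores.

```python
# Rule of thumb: weight memory ≈ parameter count × bytes per parameter.
# Activation and KV-cache memory are ignored in this rough estimate.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weight_memory_gb(params_billions: float, dtype: str = "fp16") -> float:
    """Approximate memory needed just to hold the model weights, in GB."""
    return params_billions * 1e9 * BYTES_PER_PARAM[dtype] / 1e9

for size in (8, 70, 405):
    print(f"{size}B at fp16: ~{weight_memory_gb(size):.0f} GB")
```

At fp16 this gives roughly 16 GB for the 8B model, 140 GB for 70B, and 810 GB for 405B, which is why quantization (int8, int4) matters so much for running larger models on limited hardware.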

Think of it like...

Like engine displacement in cars — bigger engines generally mean more power, but also more fuel consumption, weight, and cost.

Related Terms