Artificial Intelligence

Embedding Dimension

The number of numerical values in a vector embedding. Higher dimensions can capture more nuanced relationships but require more storage and computation.
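A minimal sketch of the idea, using a toy 8-dimensional vector (the values are made up for illustration): the embedding dimension is simply the length of the vector, and each extra dimension costs a few bytes of storage.

```python
import numpy as np

# A toy embedding: the "embedding dimension" is the vector's length.
vec = np.array([0.12, -0.45, 0.33, 0.08, -0.91, 0.27, 0.64, -0.05],
               dtype=np.float32)

print(len(vec))    # 8  -> this toy embedding has dimension 8
print(vec.nbytes)  # 32 -> 8 values x 4 bytes per float32 value
```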

Why It Matters

Embedding dimension is a key architecture choice — too few dimensions lose information, too many waste resources. Common dimensions range from 384 to 3072.
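The storage side of that trade-off is easy to quantify, since cost grows linearly with dimension. A quick sketch for float32 vectors (the one-million-vector corpus size is an assumed figure for illustration):

```python
def storage_gb(n_vectors: int, dim: int, bytes_per_value: int = 4) -> float:
    """Storage in GB for n_vectors embeddings of the given dimension.

    Assumes float32 values (4 bytes each); no index overhead included.
    """
    return n_vectors * dim * bytes_per_value / 1e9

# One million vectors at three common dimensions
for dim in (384, 1536, 3072):
    print(f"{dim:>4} dims -> {storage_gb(1_000_000, dim):.3f} GB")
```

At one million vectors, 384 dimensions take about 1.5 GB of raw storage while 3072 take about 12.3 GB, an 8x difference before any vector-index overhead is added.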

Example

OpenAI's text-embedding-3-small uses 1536 dimensions, while text-embedding-3-large uses 3072 — more dimensions for more nuanced semantic understanding.

Think of it like...

Like the resolution of a photograph — more pixels capture finer detail, but the file gets bigger and takes longer to process.

Related Terms