Machine Learning

Weight

A numerical parameter in a neural network that is learned during training. Weights determine the strength of connections between neurons and collectively encode the model's knowledge.
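The definition above can be sketched in a few lines: a single neuron computes a weighted sum of its inputs, and each weight scales how strongly one input influences the output. This is an illustrative sketch, not any particular library's API; the function name `neuron_output` and the specific values are hypothetical.

```python
def neuron_output(inputs, weights, bias):
    """Weighted sum of inputs plus a bias term (the neuron's pre-activation)."""
    return sum(x * w for x, w in zip(inputs, weights)) + bias

# Two inputs; the second weight is larger, so the second input
# influences the output more strongly. These values stand in for
# what training would learn.
inputs = [1.0, 2.0]
weights = [0.5, 1.5]
bias = 0.1

print(neuron_output(inputs, weights, bias))  # 0.5*1.0 + 1.5*2.0 + 0.1 = 3.6
```

During training, an optimizer nudges each weight up or down to reduce prediction error; the final set of weights is what gets saved as "the model."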

Why It Matters

The weights are literally what the model has learned: a GPT-4-class model has hundreds of billions of weights that together encode its capabilities.

Example

A 7B-parameter model like Llama 2 7B has roughly seven billion individual weight values, each one adjusted during training to minimize prediction error.
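Where do billions of weights come from? In a fully connected layer with n inputs and m outputs, the weight matrix alone holds n x m values, plus m biases, so counts grow multiplicatively with layer width. The sketch below counts parameters for a small hypothetical network; the layer sizes are chosen for illustration and are not taken from any specific model.

```python
def count_parameters(layer_sizes):
    """Total weights + biases for a dense network with the given layer widths."""
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out  # weight matrix: one weight per connection
        total += n_out         # bias vector: one bias per output neuron
    return total

# A tiny network: 784 inputs -> 128 hidden -> 10 outputs
# (784*128 + 128) + (128*10 + 10) = 101,770 parameters
print(count_parameters([784, 128, 10]))
```

Even this toy network has over a hundred thousand parameters; scaling widths and depths up by a few orders of magnitude is how models reach billions.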

Think of it like...

Like the strength settings on a mixing board — each slider (weight) controls how much one signal influences the final output, and training adjusts all the sliders.

Related Terms

Parameter, Bias, Neural Network, Training