Stochastic
Involving randomness or probability. In ML, common sources of stochasticity include random weight initialization, stochastic gradient descent, and probabilistic sampling during text generation.
Why It Matters
Understanding stochasticity explains why LLMs give different answers each time and why training runs produce different results — randomness is a feature, not a bug.
Example
Running the same training job twice with different random seeds produces models with slightly different performance, because weight initialization and batch ordering are random.
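The seed effect above can be sketched in a few lines: a minimal NumPy example (the layer shape and initialization scale are illustrative choices, not from any particular framework) showing that different seeds yield different initial weights, while the same seed reproduces them exactly.

```python
import numpy as np

def init_weights(seed, shape=(4, 4)):
    # Initialization is stochastic: the seed fixes the random stream.
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, 0.02, size=shape)

w_a  = init_weights(seed=0)
w_b  = init_weights(seed=1)
w_a2 = init_weights(seed=0)

# Different seeds -> different starting points -> different trained models.
assert not np.allclose(w_a, w_b)
# The same seed reproduces the same initialization exactly.
assert np.allclose(w_a, w_a2)
```

Training frameworks add further random steps (batch shuffling, dropout), so full reproducibility requires seeding all of them.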
Think of it like...
Like shuffling a deck of cards — the process has randomness built in, so you get a different arrangement each time, even following the same procedure.
Related Terms
Deterministic Output
When an AI model produces the same output every time for the same input. Achieved by setting temperature to 0 and using fixed random seeds.
Temperature
A parameter that controls the randomness or creativity of an LLM's output. Lower temperatures (closer to 0) make outputs more deterministic and focused; higher temperatures increase randomness and creativity.
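How temperature trades determinism for randomness can be sketched with temperature-scaled softmax sampling (a simplified model of what LLM decoders do; real implementations add details like top-k/top-p filtering):

```python
import numpy as np

def sample_token(logits, temperature, rng):
    # Temperature rescales logits before the softmax:
    # T -> 0 approaches greedy argmax; large T flattens toward uniform.
    if temperature == 0:
        return int(np.argmax(logits))  # deterministic: always the top token
    scaled = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())  # subtract max for numerical stability
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

logits = [2.0, 1.0, 0.1]
rng = np.random.default_rng(0)
# Temperature 0 always picks index 0 (the highest logit).
assert sample_token(logits, 0, rng) == 0
# Temperature 1 samples stochastically, so lower-logit tokens can appear.
```

With temperature 0 the sampler collapses to argmax, which is the mechanism behind the "Deterministic Output" entry above.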
Stochastic Gradient Descent
A variant of gradient descent that updates model parameters using a single random training example (or small batch) at each step instead of the entire dataset. It is faster per update than full-batch gradient descent, and its noise can help the optimizer escape shallow local minima.
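A minimal sketch of mini-batch SGD on a toy problem (the data, learning rate, and batch size are illustrative assumptions): fitting a single weight w to data generated from y = 3x + noise, with a random batch order each epoch.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data: y = 3x + noise; we fit one weight w by minimizing squared error.
x = rng.uniform(-1, 1, size=200)
y = 3.0 * x + rng.normal(0, 0.1, size=200)

w, lr, batch_size = 0.0, 0.1, 8
for epoch in range(20):
    order = rng.permutation(len(x))        # random batch ordering each epoch
    for start in range(0, len(x), batch_size):
        idx = order[start:start + batch_size]
        # Gradient of mean squared error w.r.t. w, on this mini-batch only.
        grad = np.mean(2 * (w * x[idx] - y[idx]) * x[idx])
        w -= lr * grad

# w should now be close to the true slope of 3.0.
```

Because each update sees only a random mini-batch, the gradient is a noisy estimate of the full-dataset gradient; that noise is exactly the "stochastic" in SGD.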