Machine Learning

Dropout

A regularization technique that temporarily disables (drops out) a random subset of neurons at each training step. This prevents the network from relying too heavily on any single neuron and encourages redundant representations.

Why It Matters

Dropout is one of the most effective and widely used regularization techniques. It significantly reduces overfitting with minimal computational cost.

Example

During training with a dropout rate of 0.5, each neuron in a layer is turned off with probability 0.5 at each step. At inference time, all neurons are active; to keep expected activations consistent, the surviving activations are scaled up during training (inverted dropout) or, equivalently, the weights are scaled down at test time.
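The mechanism above can be sketched in a few lines of NumPy. This is a minimal illustration of inverted dropout, not a production implementation; the function name and the fixed random seed are choices made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the example is reproducible

def dropout(x, p=0.5, training=True):
    """Inverted dropout: zero each unit with probability p during training,
    and scale the survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0.0:
        return x  # inference: identity, no rescaling needed
    mask = rng.random(x.shape) >= p  # True = neuron kept
    return x * mask / (1.0 - p)

activations = np.ones(10)
train_out = dropout(activations, p=0.5, training=True)   # entries are 0.0 or 2.0
infer_out = dropout(activations, p=0.5, training=False)  # unchanged
```

Because the surviving units are scaled by 1/(1-p) at training time, inference is simply the identity: no special handling is required when the network is deployed.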

Think of it like...

Like a sports team practicing with random players sitting out each drill — every player learns to contribute, and the team does not collapse if one person is unavailable.

Related Terms