Dropout
A regularization technique in which randomly selected neurons are temporarily disabled ("dropped out") at each training step. This prevents the network from relying too heavily on any single neuron and encourages redundant, more robust representations.
Why It Matters
Dropout is one of the most effective and widely used regularization techniques. It significantly reduces overfitting with minimal computational cost.
Example
During training with 50% dropout, half the neurons in each layer are randomly turned off at each step. At inference time, all neurons are active; in the common "inverted dropout" implementation, surviving activations are scaled up during training so that no adjustment is needed at inference.
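The example above can be sketched in a few lines of NumPy. This is a minimal illustration of inverted dropout, not any particular library's implementation; the function name and rate are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p_drop=0.5, training=True):
    """Inverted dropout: zero out units with probability p_drop during
    training, and scale the survivors so the expected activation is
    unchanged. At inference, return the input untouched."""
    if not training or p_drop == 0.0:
        return x  # inference: all neurons active, no scaling needed
    mask = rng.random(x.shape) >= p_drop   # keep each unit with prob. 1 - p_drop
    return x * mask / (1.0 - p_drop)       # rescale surviving activations

activations = np.ones(8)
print(dropout(activations, p_drop=0.5))     # roughly half the entries zeroed, rest scaled to 2.0
print(dropout(activations, training=False)) # unchanged at inference
```

Because of the rescaling, the expected value of each activation is the same with and without dropout, which is what lets inference run with all neurons active.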
Think of it like...
Like a sports team practicing with random players sitting out each drill — every player learns to contribute, and the team does not collapse if one person is unavailable.
Related Terms
Regularization
Techniques used to prevent overfitting by adding constraints or penalties to the model during training. Regularization discourages the model from becoming too complex or fitting noise in the training data.
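One common concrete penalty is L2 regularization (weight decay), which adds the sum of squared weights to the loss. A minimal NumPy sketch, with a hypothetical function name and penalty strength chosen for illustration:

```python
import numpy as np

def l2_penalized_loss(weights, data_loss, lam=1e-2):
    """Total loss = data loss + lam * sum of squared weights.
    A larger lam pushes weights toward zero, discouraging
    overly complex fits to noise in the training data."""
    return data_loss + lam * np.sum(weights ** 2)

w = np.array([3.0, -2.0, 0.5])
print(l2_penalized_loss(w, data_loss=1.0))  # 1.0 + 0.01 * 13.25 = 1.1325
```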
Overfitting
When a model learns the training data too well — including its noise and random fluctuations — and performs poorly on new, unseen data. The model essentially memorizes rather than generalizes.
Batch Normalization
A technique that normalizes the inputs to each layer in a neural network so that, across the current batch, each feature has zero mean and unit variance, then applies a learnable scale and shift. This stabilizes and accelerates the training process.
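The per-feature normalization can be sketched directly in NumPy. This is a simplified training-time version (it omits the running statistics real implementations keep for inference), with names chosen for illustration:

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize each feature (column) over the batch to zero mean and
    unit variance, then apply the learnable scale (gamma) and shift (beta).
    eps guards against division by zero for constant features."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Two features on very different scales; both come out normalized.
batch = np.array([[1.0, 200.0],
                  [2.0, 400.0],
                  [3.0, 600.0]])
out = batch_norm(batch)
print(out.mean(axis=0))  # ~0 for each feature
print(out.std(axis=0))   # ~1 for each feature
```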
Neural Network
A computing system inspired by the biological neural networks in the human brain. It consists of interconnected nodes (neurons) organized in layers that process information and learn to recognize patterns.
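The "interconnected nodes organized in layers" idea reduces to a few matrix operations. A minimal two-layer forward pass in NumPy, with layer sizes and weight initialization chosen arbitrarily for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, b1, W2, b2):
    """Two-layer network: linear -> ReLU -> linear."""
    h = np.maximum(0.0, x @ W1 + b1)  # hidden layer with ReLU activation
    return h @ W2 + b2                # output layer

# Hypothetical sizes: 4 inputs, 8 hidden units, 2 outputs.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

x = rng.normal(size=(1, 4))               # a single input example
print(forward(x, W1, b1, W2, b2).shape)   # (1, 2)
```

Training adjusts the weight matrices and biases so the outputs match the desired patterns; techniques like dropout and batch normalization operate on the layer activations inside this pass.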