Machine Learning

Batch Normalization

A technique that normalizes a layer's inputs using statistics computed over each mini-batch, shifting and scaling them to zero mean and unit variance, then applying a learned scale and shift. This stabilizes and accelerates the training process.

Why It Matters

Batch normalization lets you use higher learning rates and train deeper networks more reliably. It is a standard component in modern neural network architectures.

Example

After each layer processes a batch of data, batch normalization adjusts the outputs so they are centered around zero with consistent spread before passing to the next layer.
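The adjustment above can be sketched in a few lines. This is a minimal, illustrative forward pass (not a production implementation): it normalizes each feature across the batch, then applies the learned scale (`gamma`) and shift (`beta`); the `eps` constant and toy input values are assumptions for the example.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x has shape (batch_size, num_features).
    # Compute per-feature statistics over the batch dimension.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Center and scale to roughly zero mean, unit variance.
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learned scale and shift let the network undo the
    # normalization if that helps the next layer.
    return gamma * x_hat + beta

# A toy batch: 4 samples, 3 features each.
x = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.0],
              [4.0, 8.0, 12.0]])
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
# Each feature column of `out` is now centered around zero
# with approximately unit spread.
```

At training time, frameworks also keep a running average of the batch statistics so that at inference, where batches may be tiny or absent, normalization uses those fixed averages instead.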

Think of it like...

Like a teacher who curves each exam so grades are comparable — it ensures each layer of the network receives well-calibrated input regardless of what happened before.

Related Terms