Machine Learning

Exploding Gradient

A training problem where gradients become extremely large during backpropagation, causing weight updates to be so drastic that the model becomes unstable and training diverges.

Why It Matters

Exploding gradients can completely derail model training: once weights receive a huge update, the loss often diverges and never recovers. Gradient clipping is the standard remedy, but most frameworks do not apply it by default, so it usually has to be enabled explicitly.
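A minimal sketch in plain Python of both the problem and the fix. The `clip_by_norm` helper is a hypothetical stand-in for the clipping utilities real frameworks provide (e.g. PyTorch's `torch.nn.utils.clip_grad_norm_`):

```python
import math

def clip_by_norm(grads, max_norm):
    """Scale gradients down so their global L2 norm is at most max_norm."""
    norm = math.sqrt(sum(g * g for g in grads))
    if norm > max_norm:
        scale = max_norm / norm
        grads = [g * scale for g in grads]
    return grads

# Backpropagation through a deep chain multiplies local gradients together.
# With a recurring factor of 2.0 per layer, the gradient doubles at every
# layer and grows exponentially with depth.
grad = 1.0
for _ in range(50):
    grad *= 2.0
print(grad)  # ~1.1e15 -- far too large for a stable weight update

# Clipping rescales the gradient to a safe magnitude before the update.
clipped = clip_by_norm([grad], max_norm=5.0)
print(clipped[0])  # 5.0
```

Clipping does not address why the gradients explode; it only caps the size of each update so training can continue.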

Example

During training, the loss suddenly spikes to NaN (not a number) because a gradient grew so large that the weight update overflowed the floating-point range.

Think of it like...

Like a microphone feedback loop — a small signal gets amplified over and over until it becomes an ear-splitting screech that destroys the signal entirely.

Related Terms