
Exploding Gradients

An optimization problem in which gradients grow excessively during backpropagation and destabilize training.

Exploding gradients arise mainly in very deep networks and in recurrent networks processing long sequences: backpropagation multiplies many Jacobians together, and when their norms are consistently greater than one the product can grow exponentially with depth. The resulting parameter updates become erratically large, which can cause divergent loss or numerical overflow (NaN values). Common mitigations include lowering the learning rate, using careful weight initialization, and gradient clipping, which rescales gradients whose norm exceeds a chosen threshold.
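Gradient clipping by global norm can be sketched in a few lines of NumPy. The function name `clip_by_global_norm` and the threshold value are illustrative choices, not a specific library API; frameworks such as PyTorch offer equivalents like `torch.nn.utils.clip_grad_norm_`.

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Rescale a list of gradient arrays so their combined L2 norm
    is at most max_norm; direction is preserved."""
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if global_norm > max_norm:
        scale = max_norm / global_norm
        grads = [g * scale for g in grads]
    return grads

# A gradient of norm 30 is scaled down to norm 5 before the update.
clipped = clip_by_global_norm([np.array([30.0, 0.0])], max_norm=5.0)
```

Because only the magnitude changes, the update still points in the descent direction; gradients already below the threshold pass through untouched.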