# Exploding Gradients

> Source: https://sukruyusufkaya.com/en/glossary/exploding-gradients
> Updated: 2026-05-13T19:58:57.931Z
> Type: glossary
> Category: derin-ogrenme

**TLDR:** An optimization problem in which gradients grow excessively during backpropagation and destabilize training.

<p>Exploding gradients arise when derivatives grow excessively during backpropagation, particularly in very deep networks or in recurrent networks processing long sequences, where repeated multiplication by large Jacobians compounds the growth at each layer or time step. The resulting parameter updates become uncontrolled, producing large loss spikes or numeric overflow (NaN values) that destabilize training. Techniques such as learning-rate control, better weight initialization, and gradient clipping are commonly used to mitigate the issue.</p>
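Gradient clipping, mentioned above, can be sketched as follows. This is a minimal NumPy illustration (the function name and `max_norm` parameter are chosen for this example, not taken from any particular library): if the global L2 norm of all gradients exceeds a threshold, every gradient is rescaled so the norm equals that threshold, keeping the update direction but bounding its magnitude.

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Rescale a list of gradient arrays so their combined L2 norm
    does not exceed max_norm. Returns the clipped gradients and the
    norm measured before clipping."""
    # Global norm across all parameter groups, not per-array.
    total_norm = np.sqrt(sum(float(np.sum(g * g)) for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        grads = [g * scale for g in grads]
    return grads, total_norm

# Example: a gradient of norm 5 clipped to norm 1.
grads = [np.array([3.0, 4.0])]
clipped, norm_before = clip_by_global_norm(grads, max_norm=1.0)
```

Deep-learning frameworks ship equivalents of this (e.g. `torch.nn.utils.clip_grad_norm_` in PyTorch); clipping by global norm rather than per element preserves the relative proportions of the gradient components.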