
Backpropagation

The core learning mechanism that propagates error signals backward through a network's layers, computing the gradient of the loss with respect to every weight.

Backpropagation is the algorithm that makes training deep networks practical. After a forward pass, it propagates the error at the output backward through the layers, applying the chain rule to compute how much each parameter contributed to the overall loss. An optimizer such as gradient descent then uses these gradients to update the weights, which lets even networks with millions of parameters learn systematically. The method is theoretically grounded in the chain rule and is made practical through modern automatic differentiation systems.
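As a minimal sketch of the idea, the following trains a tiny two-layer network on a toy regression task, with the backward pass written out by hand. All names and the network shape are illustrative choices for this example, not from any particular library; the tanh derivative `1 - tanh(z)^2` is the chain-rule factor for the hidden layer.

```python
import numpy as np

# Toy data: 16 samples with 3 features, random regression targets.
rng = np.random.default_rng(0)
X = rng.normal(size=(16, 3))
y = rng.normal(size=(16, 1))

# Parameters of a 2-layer network: linear -> tanh -> linear.
W1 = rng.normal(size=(3, 4)) * 0.1
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)) * 0.1
b2 = np.zeros(1)

losses = []
lr = 0.1
for step in range(200):
    # Forward pass: compute the prediction and the loss.
    z1 = X @ W1 + b1               # hidden pre-activation
    h = np.tanh(z1)                # hidden activation
    out = h @ W2 + b2              # network output
    losses.append(np.mean((out - y) ** 2))  # mean squared error

    # Backward pass: chain rule applied layer by layer,
    # starting from dL/d(out) and moving toward the input.
    dout = 2 * (out - y) / len(X)  # gradient of MSE w.r.t. output
    dW2 = h.T @ dout
    db2 = dout.sum(axis=0)
    dh = dout @ W2.T               # propagate the error into the hidden layer
    dz1 = dh * (1 - h ** 2)        # tanh'(z1) = 1 - tanh(z1)^2
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)

    # Optimizer step (gradient descent) — separate from backprop itself,
    # which only computes the gradients used here.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Note the division of labor: the backward pass produces the gradients, and the update rule at the bottom consumes them; frameworks with automatic differentiation generate the backward pass mechanically from the forward code.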