Technical Glossary: Mathematics, Statistics and Optimization
Backpropagation
The core algorithm in neural networks that computes gradients by propagating error backward through layers.
Backpropagation is the core algorithm that makes deep learning practical. The error produced by the network is propagated backward from the output layer, allowing the partial derivative of the loss with respect to each parameter to be computed. This process relies on the chain rule and makes it possible to update large numbers of parameters efficiently, at a cost comparable to a single forward pass. Without backpropagation, training large neural networks at modern scale would be nearly impossible. For that reason, it is one of the invisible but vital engines of modern AI systems.
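The chain-rule bookkeeping described above can be sketched for a tiny two-layer network in NumPy. The network shape, random data, and learning rate here are illustrative assumptions, not taken from the source; each backward-pass line computes one derivative and passes the error signal to the layer before it.

```python
import numpy as np

# Hypothetical setup: 4 samples, 3 features, one hidden layer of 5 units.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # inputs
y = rng.normal(size=(4, 1))          # regression targets
W1 = rng.normal(size=(3, 5)) * 0.1   # first-layer weights
W2 = rng.normal(size=(5, 1)) * 0.1   # second-layer weights

losses = []
for step in range(200):
    # Forward pass: linear -> ReLU -> linear, then mean squared error.
    h = x @ W1
    a = np.maximum(h, 0.0)
    pred = a @ W2
    losses.append(np.mean((pred - y) ** 2))

    # Backward pass: apply the chain rule layer by layer, propagating
    # the error from the output back toward the input.
    grad_pred = 2.0 * (pred - y) / len(y)   # dLoss/dPred
    grad_W2 = a.T @ grad_pred               # dLoss/dW2
    grad_a = grad_pred @ W2.T               # error passed to hidden layer
    grad_h = grad_a * (h > 0)               # ReLU gates the gradient
    grad_W1 = x.T @ grad_h                  # dLoss/dW1

    # Gradient-descent update using the computed gradients.
    W1 -= 0.1 * grad_W1
    W2 -= 0.1 * grad_W2
```

After the loop, the loss should have decreased, since every parameter was nudged against its own gradient on each step.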
