Backpropagation Through Time
A training method for sequential models in which the network is unrolled across time steps and gradients are computed backward through the unrolled steps.
Backpropagation Through Time (BPTT) is an extension of standard backpropagation used chiefly to train recurrent neural networks (RNNs). The network is unrolled across time steps, and each step's contribution to the loss is propagated backward through the preceding steps. While this makes learning long-range dependencies theoretically possible, it also introduces major challenges such as vanishing and exploding gradients; architectures like LSTM and GRU were developed largely to mitigate these issues.
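
The sketch below illustrates the idea for a vanilla RNN in plain NumPy: a forward loop unrolls the network over T steps, and a backward loop walks from the last step to the first, accumulating gradients for the shared weights. All names, shapes, and the squared-error loss are illustrative assumptions, not a specific library's API.

# Minimal sketch of backpropagation through time for a vanilla RNN (NumPy).
# Sizes, weights, and data below are illustrative assumptions.
import numpy as np

T, input_dim, hidden_dim = 5, 3, 4          # sequence length and sizes (assumed)
rng = np.random.default_rng(0)

# Parameters of the RNN, shared by every unrolled time step.
W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
W_hy = rng.normal(scale=0.1, size=(1, hidden_dim))

xs = rng.normal(size=(T, input_dim))        # toy input sequence
targets = rng.normal(size=(T, 1))           # toy regression targets

# Forward pass: unroll the network across the T time steps.
hs = [np.zeros(hidden_dim)]                 # hs[t] is the hidden state before step t
ys = []
for t in range(T):
    h = np.tanh(W_xh @ xs[t] + W_hh @ hs[-1])
    hs.append(h)
    ys.append(W_hy @ h)
loss = sum(float(((y - tgt) ** 2).sum()) for y, tgt in zip(ys, targets))

# Backward pass: trace each step's contribution to the loss back into the past,
# accumulating gradients for the shared weights at every step.
dW_xh = np.zeros_like(W_xh)
dW_hh = np.zeros_like(W_hh)
dW_hy = np.zeros_like(W_hy)
dh_next = np.zeros(hidden_dim)              # gradient flowing in from step t+1
for t in reversed(range(T)):
    dy = 2.0 * (ys[t] - targets[t])         # dLoss/dy_t for the squared error
    dW_hy += np.outer(dy, hs[t + 1])
    dh = W_hy.T @ dy + dh_next              # gradient w.r.t. hidden state h_t
    dz = dh * (1.0 - hs[t + 1] ** 2)        # through the tanh nonlinearity
    dW_xh += np.outer(dz, xs[t])
    dW_hh += np.outer(dz, hs[t])
    dh_next = W_hh.T @ dz                   # pass the gradient one step further back

The repeated multiplication by W_hh in the backward loop is where gradients can vanish or explode; in practice the backward pass is often truncated to a fixed window (truncated BPTT) to bound memory and computation.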
