Technical Glossary › Deep Learning
Additive Attention
An early attention approach that compares query and context representations through a learnable combination function.
Additive attention is an attention mechanism used particularly in early seq2seq architectures, introduced by Bahdanau et al. (2015) for neural machine translation. Instead of taking a dot product between query and key vectors, it computes scores with a small learnable network: the query and each key are passed through separate linear transformations, summed, squashed with tanh, and projected to a scalar score. Because the scoring function is learned, this can provide more flexible alignment behavior, and it allows queries and keys of different dimensionalities. It is also historically important for understanding the evolution of attention in modern deep learning.
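The scoring step described above can be sketched as follows. This is a minimal NumPy illustration, not a reference implementation; the weight shapes and names (`W_q`, `W_k`, `v`) are illustrative assumptions.

```python
import numpy as np

def additive_attention(query, keys, values, W_q, W_k, v):
    """Additive (Bahdanau-style) attention over a set of keys.

    score_t = v . tanh(W_q @ query + W_k @ key_t)
    The scores are softmax-normalized and used to take a
    weighted sum of the value vectors.
    """
    # Project query and keys into a shared attention space, then squash.
    # query @ W_q has shape (d_attn,), keys @ W_k has shape (T, d_attn);
    # broadcasting adds the query projection to every key projection.
    features = np.tanh(query @ W_q + keys @ W_k)
    scores = features @ v                    # (T,) one score per key
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ values               # weighted sum of values
    return context, weights
```

Note that `query` and `keys` may have different dimensionalities, since each gets its own projection matrix; this is one practical difference from plain dot-product attention.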
