Rotary Positional Embedding
A modern positional representation method that encodes token order through position-dependent rotations of query and key vectors.
Rotary positional embedding (RoPE) integrates positional information directly into the attention computation: each query and key vector is split into two-dimensional pairs, and each pair is rotated by an angle proportional to the token's position. Because the dot product of two rotated vectors depends only on the difference of their rotation angles, attention scores become a function of relative rather than absolute position. This is advantageous for Transformer models that must generalize across long contexts, and its growing use in large language models reflects this flexibility: it tends to handle varying and extended sequence lengths more gracefully than fixed sinusoidal or learned absolute positional encodings.
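As a concrete illustration, here is a minimal NumPy sketch of the rotation. The function name `rope`, the base of 10000 (the convention from the original RoPE paper), and the toy check at the end are illustrative choices, not a specific library's API:

```python
import numpy as np

def rope(x, base=10000.0):
    """Apply rotary positional embedding to x of shape (seq_len, dim).

    Dimension pairs (2i, 2i+1) are rotated by an angle that grows with
    position and shrinks with i, so dot products between rotated queries
    and keys depend only on their relative offset.
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "embedding dimension must be even"
    # One frequency per dimension pair: theta_i = base^(-2i/dim)
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)      # (dim/2,)
    angles = np.outer(np.arange(seq_len), inv_freq)       # (seq_len, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x_even, x_odd = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x_even * cos - x_odd * sin             # standard 2D rotation
    out[:, 1::2] = x_even * sin + x_odd * cos
    return out

# Relative-position property: the same query/key vectors placed at
# positions (0, 2) and (3, 5) yield identical attention scores,
# because both pairs have relative offset 2.
rng = np.random.default_rng(0)
q = np.tile(rng.standard_normal(8), (6, 1))  # same query vector at every position
k = np.tile(rng.standard_normal(8), (6, 1))  # same key vector at every position
rq, rk = rope(q), rope(k)
print(np.isclose(rq[0] @ rk[2], rq[3] @ rk[5]))  # True
```

The check at the end makes the key point concrete: after rotation, the query-key dot product is unchanged when both positions shift by the same amount, which is why attention with RoPE depends only on relative position.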
