# ALiBi: Attention with Linear Biases — The Simple Solution of Press et al. (2021) and Its Extrapolation Advantage

> Source: https://sukruyusufkaya.com/en/learn/llm-muhendisligi/alibi-attention-linear-biases-press-2021
> Updated: 2026-05-13T13:00:28.086Z
> Category: LLM Engineering
> Module: Module 9: Position Encoding — Order-Embedded Meaning
**TLDR:** ALiBi (Press et al., 2021) injects position information by adding a linear bias to attention scores instead of using position embeddings. Math: attention[i,j] += m × (j − i), where i is the query position and j is the key position, so under a causal mask more distant keys receive a larger penalty. Each head h gets its own slope from the geometric hierarchy m_h = 2^{-8h/H} (H = number of heads). Strengths: zero extra parameters, train-short/evaluate-long extrapolation, and a very simple implementation. Also covered: a comparison with RoPE, plus Mistral and BLOOM usage.
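A minimal NumPy sketch of that formula (illustrative code, not from the article; the function names and tensor shapes here are my assumptions):

```python
import numpy as np

def alibi_slopes(num_heads: int) -> np.ndarray:
    # m_h = 2^{-8h/H} for h = 1..H (the paper adjusts this recipe
    # slightly when H is not a power of two).
    return np.array([2.0 ** (-8.0 * h / num_heads) for h in range(1, num_heads + 1)])

def alibi_bias(seq_len: int, num_heads: int) -> np.ndarray:
    # bias[h, i, j] = m_h * (j - i): zero on the diagonal, increasingly
    # negative for keys further behind the query under a causal mask.
    i = np.arange(seq_len)[:, None]     # query positions, shape (L, 1)
    j = np.arange(seq_len)[None, :]     # key positions,   shape (1, L)
    distance = j - i                    # shape (L, L)
    m = alibi_slopes(num_heads)         # shape (H,)
    return m[:, None, None] * distance  # shape (H, L, L)

# Illustrative use: add the bias to the raw attention scores before
# applying the causal mask and softmax, e.g.
#   scores = scores + alibi_bias(seq_len, num_heads)
```

Because the bias depends only on the relative offset j − i and carries no learned parameters, the same function works unchanged at sequence lengths longer than those seen during training, which is the source of ALiBi's extrapolation advantage.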

