# Long Context Extrapolation: NTK-Aware Scaling + YaRN + LongRoPE — Journey from 8K to 1M Tokens

> Source: https://sukruyusufkaya.com/en/learn/llm-muhendisligi/long-context-ntk-aware-yarn-longrope-extrapolation
> Updated: 2026-05-13T13:00:28.174Z
> Category: LLM Engineering
> Module: Module 9: Position Encoding — Order-Embedded Meaning

**TLDR:** Extending RoPE to long context: the intuition behind NTK-aware scaling; YaRN (Peng et al., 2023), a comprehensive solution with temperature scaling; and LongRoPE (Microsoft, 2024), which reaches a 2M-token context. Covers recipes for extending Llama-3-8B from its 8K base to 128K, the tricks behind Gemini 1.5's 1M-token context, and a fine-tuning protocol.
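As a taste of what the module covers, here is a minimal sketch of the NTK-aware base adjustment. Instead of dividing positions by the scale factor (linear position interpolation), it raises RoPE's frequency base so high-frequency dimensions are nearly untouched while low-frequency dimensions stretch to cover the longer context. The `head_dim=128`, `base=10000` values are typical Llama-style defaults; the function names are illustrative, not from any library.

```python
def ntk_aware_base(base: float, scale: float, dim: int) -> float:
    # NTK-aware scaling: new_base = base * scale^(dim / (dim - 2)).
    # This leaves theta_0 = 1.0 unchanged and stretches the lowest
    # frequency by exactly `scale`.
    return base * scale ** (dim / (dim - 2))

def rope_inv_freqs(base: float, dim: int) -> list[float]:
    # Standard RoPE inverse frequencies: theta_i = base^(-2i / dim)
    return [base ** (-2.0 * i / dim) for i in range(dim // 2)]

# Example: head_dim=128, base=10000, extending 8K -> 32K (scale=4)
orig = rope_inv_freqs(10000.0, 128)
scaled = rope_inv_freqs(ntk_aware_base(10000.0, 4.0, 128), 128)

print(orig[0], scaled[0])      # highest frequency: unchanged (1.0, 1.0)
print(orig[-1] / scaled[-1])   # lowest frequency: stretched by ~4x
```

The exponent `dim / (dim - 2)` is chosen so that the stretch at the lowest-frequency dimension comes out to exactly the scale factor; intermediate dimensions interpolate smoothly in between.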

