# AdamW + Learning Rate Schedule: Mathematical Anatomy of Modern LLM Optimization

> Source: https://sukruyusufkaya.com/en/learn/llm-muhendisligi/adamw-optimizer-learning-rate-schedule
> Updated: 2026-05-13T13:00:28.705Z
> Category: LLM Engineering
> Module: Module 11: Pre-training Dynamics + Optimizer Math

**TLDR:** Modern LLM optimization traces the evolution from SGD through Adam to AdamW, including the weight decay decoupling introduced by Loshchilov & Hutter (2019). It builds intuition for the momentum term (β1 = 0.9) and the variance estimate (β2 = 0.95), covers learning rate schedules (cosine decay, linear decay, and why warmup is needed), and closes with gradient clipping, mixed precision training, and common hyperparameter pitfalls.
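As a rough orientation before the detailed derivations, here is a minimal sketch of the two central pieces named in the TLDR: a single decoupled AdamW update and a linear-warmup + cosine-decay schedule. The function names, the `weight_decay=0.1` default, and the scalar-array framing are illustrative assumptions, not code from the article.

```python
import math
import numpy as np

def cosine_lr(step, max_lr, warmup_steps, total_steps, min_lr=0.0):
    """Linear warmup to max_lr, then cosine decay down to min_lr."""
    if step < warmup_steps:
        return max_lr * (step + 1) / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * progress))

def adamw_step(param, grad, m, v, step, lr, beta1=0.9, beta2=0.95,
               eps=1e-8, weight_decay=0.1):
    """One AdamW update (step counts from 1). Weight decay is applied
    directly to the parameter, decoupled from the Adam gradient term."""
    m = beta1 * m + (1 - beta1) * grad        # first moment: momentum (beta1 = 0.9)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment: variance estimate (beta2 = 0.95)
    m_hat = m / (1 - beta1 ** step)           # bias correction for zero-initialized moments
    v_hat = v / (1 - beta2 ** step)
    param = param - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * param)
    return param, m, v
```

In this sketch the decoupling is visible in the last line: the `weight_decay * param` term is scaled by the learning rate but never passes through the adaptive `sqrt(v_hat)` denominator, which is the change Loshchilov & Hutter made relative to Adam with L2 regularization.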

