# Numerical Stability: Log-Sum-Exp, FP16 Pitfalls, NaN Hunting — Hidden Hours of LLM Training

> Source: https://sukruyusufkaya.com/en/learn/llm-muhendisligi/numerik-stabilite-fp16-bf16-fp8-nan
> Updated: 2026-05-13T13:00:23.372Z
> Category: LLM Engineering (LLM Mühendisliği)
> Module: Module 1: The AI Engineer's Mathematical Arsenal
**TLDR:** Floating-point representations (FP32, FP16, BF16, FP8); overflow, underflow, and NaN hunting; the log-sum-exp trick and numerically stable softmax; mixed-precision training (autocast + GradScaler); and the numerical roots of pretraining loss spikes.
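As a taste of the log-sum-exp trick the TLDR mentions, here is a minimal NumPy sketch (function names are my own, not from the article): subtracting the maximum before exponentiating keeps every exponent at or below zero, so no intermediate value can overflow even for very large logits.

```python
import numpy as np

def logsumexp(x: np.ndarray) -> float:
    # Shift by the max so exp() never sees a large positive argument;
    # adding the max back leaves the result mathematically unchanged:
    # log(sum(exp(x))) = m + log(sum(exp(x - m))) for any m.
    m = np.max(x)
    return float(m + np.log(np.sum(np.exp(x - m))))

def stable_softmax(x: np.ndarray) -> np.ndarray:
    # softmax(x)_i = exp(x_i - logsumexp(x)); every exponent is <= 0,
    # so the naive overflow (exp(1000) -> inf, inf/inf -> nan) cannot occur.
    return np.exp(x - logsumexp(x))

logits = np.array([1000.0, 1001.0, 1002.0])  # naive exp() would overflow here
probs = stable_softmax(logits)
```

The same shift-by-max idea is what frameworks apply inside fused softmax and cross-entropy kernels; the sketch above just makes the mechanism explicit.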

