# Continual Pre-training TR: Catastrophic Forgetting Mitigation + Replay Buffer

> Source: https://sukruyusufkaya.com/en/learn/fine-tuning-cookbook/ftc-tr-continual-pretraining-replay
> Updated: 2026-05-14T14:42:56.308Z
> Category: Fine-Tuning Cookbook (Model-by-Model)
> Module: Part IX — Turkish-First & Localization Engineering

**TLDR:** The main risk of continual pre-training is catastrophic forgetting: the model loses English ability while learning Turkish. Mitigations covered: a replay buffer (10-15% English data per batch), learning-rate warmup, and why the LR should be 1/10 to 1/50 of the original pre-training LR. A 2B-token Turkish continual pre-training run on Llama 8B is feasible in 24h on an RTX 4090.
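The replay-buffer idea above can be sketched as a batch sampler that draws a fixed fraction of each batch from the original-distribution (English) pool. This is a minimal illustration, not the article's implementation; the pool names, batch size, and `en_ratio` default are assumptions (the article only specifies the 10-15% range).

```python
import random

def mixed_batch(tr_pool, en_pool, batch_size=32, en_ratio=0.125, rng=None):
    """Sample one training batch with ~12.5% English replay examples.

    tr_pool / en_pool: lists of pre-tokenized examples (hypothetical shape).
    en_ratio: fraction drawn from the English replay buffer; the article
    suggests keeping this in the 10-15% range.
    """
    rng = rng or random.Random()
    # At least one replay example per batch, even for tiny batch sizes.
    n_en = max(1, round(batch_size * en_ratio))
    batch = rng.sample(en_pool, n_en) + rng.sample(tr_pool, batch_size - n_en)
    rng.shuffle(batch)  # avoid a fixed EN-then-TR ordering within the batch
    return batch
```

In a streaming setup the same ratio is usually enforced by interleaving two dataset iterators with weighted sampling rather than by materializing pools in memory.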


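The LR guidance in the TLDR (scale the original pre-training LR down by 1/10 to 1/50, with warmup) can be sketched as a simple schedule. All concrete numbers here are illustrative assumptions, not the article's config: `base_lr=3e-4` stands in for a typical Llama pre-training peak LR, and the decay shape is a plain linear ramp-down.

```python
def lr_at_step(step, base_lr=3e-4, shrink=20, warmup_steps=1000, total_steps=100_000):
    """Continual-PT learning rate at a given optimizer step.

    Peak LR = base_lr / shrink, with shrink in the article's suggested
    10-50 range; linear warmup to the peak, then linear decay to zero.
    """
    peak = base_lr / shrink
    if step < warmup_steps:
        # Warmup: ramp from 0 to the (already shrunk) peak LR.
        return peak * step / warmup_steps
    frac = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return peak * max(0.0, 1.0 - frac)
```

The same shape is easy to express with a framework scheduler (e.g. a lambda-based LR scheduler); the key point is that warmup targets the shrunken peak, never the original pre-training LR.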