# Sequence Parallel + Context Parallel: Ulysses + Ring Attention + 1M Context

> Source: https://sukruyusufkaya.com/en/learn/fine-tuning-cookbook/ftc-sequence-context-parallel-ulysses-ring
> Updated: 2026-05-14T14:42:52.666Z
> Category: Fine-Tuning Cookbook (Model-by-Model)
> Module: Part IV — Mid-Large Models (13B-70B+) + Distributed Internals

**TLDR:** Long-context fine-tuning hits a hard memory wall on a single GPU; sequence/context parallelism breaks it by splitting the sequence dimension across GPUs. Covers DeepSpeed-Ulysses (head-wise sequence parallelism via all-to-all), Ring Attention (Berkeley), and Megatron Sequence Parallel, which together enable 1M-token context. This is the technical foundation of Moonshot's Kimi-1.5 2M-context recipe.
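To make "head-wise" concrete before diving in: the core move in Ulysses-style sequence parallelism is an all-to-all that converts sequence shards into head shards, so each GPU can run ordinary full-sequence attention on its subset of heads. Below is a minimal single-process NumPy sketch of that redistribution (toy dimensions and helper name are illustrative, not DeepSpeed's actual API):

```python
import numpy as np

# Toy simulation of the Ulysses-style all-to-all (illustrative only;
# real implementations use torch.distributed collectives on P GPUs).
P = 4        # sequence-parallel group size (number of GPUs)
S = 8        # full sequence length (S % P == 0)
H = 4        # number of attention heads (H % P == 0)
D = 2        # head dimension

rng = np.random.default_rng(0)
x = rng.standard_normal((S, H, D))          # full activations [S, H, D]

# Before attention: rank r holds a contiguous sequence shard [S/P, H, D].
seq_shards = [x[r * S // P:(r + 1) * S // P] for r in range(P)]

def ulysses_all_to_all(shards, P, S, H):
    """Hypothetical helper: each rank sends head-slices of its sequence
    shard so that afterwards rank r holds the FULL sequence for heads
    r*H//P : (r+1)*H//P — shape [S, H/P, D]."""
    out = []
    for r in range(P):                       # receiving rank r
        pieces = [shard[:, r * H // P:(r + 1) * H // P] for shard in shards]
        out.append(np.concatenate(pieces, axis=0))
    return out

head_shards = ulysses_all_to_all(seq_shards, P, S, H)

# Each rank can now run standard full-sequence attention on its heads,
# then an inverse all-to-all restores the sequence-sharded layout.
assert head_shards[0].shape == (S, H // P, D)
assert np.allclose(head_shards[1], x[:, H // P:2 * H // P])
```

Note the contrast with Ring Attention: Ulysses keeps attention local by exchanging activations once per layer, whereas Ring Attention keeps sequence shards in place and circulates key/value blocks around a ring of devices.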

