# DeepSeek-Coder-V2 16B / 236B: MoE Code Model + Multi-File Context

> Source: https://sukruyusufkaya.com/en/learn/fine-tuning-cookbook/ftc-deepseek-coder-v2-moe-code
> Updated: 2026-05-14T14:42:55.494Z
> Category: Fine-Tuning Cookbook (Model-by-Model)
> Module: Part VIII — Code Models & Repo-Level FT
**TLDR:** DeepSeek-Coder-V2 (DeepSeek, 2024) is an MoE code model in 16B and 236B variants and one of the strongest open code LLMs; the code is MIT-licensed and the weights ship under the DeepSeek Model License, which permits commercial use. 338 programming languages, 128K context, multi-file repo understanding. The 16B variant (2.4B active params) is QLoRA-trainable on an RTX 4090; the 236B is cloud only.


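Since the TLDR claims the 16B variant fits QLoRA fine-tuning on a single RTX 4090, a minimal loading sketch follows. It uses 4-bit NF4 quantization via bitsandbytes plus a PEFT LoRA adapter; the model ID, LoRA hyperparameters, and target module names are assumptions (DeepSeek-V2's MLA attention names its projections differently from Llama-style models), so verify them against the model card before training.

```python
# Minimal QLoRA loading sketch for DeepSeek-Coder-V2-Lite (16B total, 2.4B active).
# Assumes transformers, peft, and bitsandbytes are installed; all names below
# are illustrative and should be checked against the official model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_id = "deepseek-ai/DeepSeek-Coder-V2-Lite-Base"  # assumed HF repo name

# 4-bit NF4 quantization keeps the frozen base weights within a 24 GB budget.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,  # DeepSeek-V2 ships custom modeling code
)

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    # Attention projections only; the MoE expert FFNs stay frozen to keep the
    # trainable parameter count small. Module names assume the Lite variant's
    # MLA attention implementation and must be verified before training.
    target_modules=["q_proj", "kv_a_proj_with_mqa", "kv_b_proj", "o_proj"],
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

From here the adapted model can be passed to a standard Trainer / SFT loop; only the LoRA adapters receive gradients, which is what makes the 16B MoE tractable on a single consumer GPU.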