# Gemma 3 1B / 4B / 12B / 27B: Google's 256K Vocab + Multimodal (4B+)

> Source: https://sukruyusufkaya.com/en/learn/fine-tuning-cookbook/ftc-gemma-3-1b-4b-12b-27b-recipe
> Updated: 2026-05-14T14:42:51.686Z
> Category: Fine-Tuning Cookbook (Model-by-Model)
> Module: Part III — Small Open Models (1B–8B)
**TLDR:** Gemma 3 (1B/4B/12B/27B) is Google's 2025 open-model family: 256K vocabulary, multimodal from 4B up via a SigLIP vision tower, GeGLU activations, RMSNorm, 128K context (32K on the 1B), with ShieldGemma available as a companion safety classifier. The 4B and 12B fit QLoRA fine-tuning on a single RTX 4090. The chat template has no system role (prepend system instructions to the first user turn), and usage falls under the Gemma Terms of Use rather than a standard open-source license.
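The "no system role" point is easy to trip over: Gemma's chat template only defines `user` and `model` turns, so system instructions must be folded into the first user turn. A minimal sketch of that, building the prompt string by hand with Gemma's published turn markers (`build_gemma_prompt` is an illustrative helper name, not a library function):

```python
# Gemma's chat template has no "system" role, so system instructions
# are prepended to the first user turn instead.
# Turn markers follow the published Gemma format:
#   <start_of_turn>user ... <end_of_turn> then <start_of_turn>model

def build_gemma_prompt(system_prompt: str, user_message: str) -> str:
    """Fold system instructions into the first user turn (illustrative helper)."""
    merged = f"{system_prompt}\n\n{user_message}" if system_prompt else user_message
    return (
        "<start_of_turn>user\n"
        f"{merged}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

prompt = build_gemma_prompt(
    "You are a concise assistant.",
    "Summarize Gemma 3 in one sentence.",
)
print(prompt)
```

In practice, `tokenizer.apply_chat_template(...)` on the Gemma tokenizer performs this formatting for you; recent template versions merge a leading `system` message into the first user turn, but older ones raise on it, so flattening messages yourself as above is the safe path.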

