# AWQ Algorithm: Activation-Aware Salient Channel Scaling — Respecting Outliers

> Source: https://sukruyusufkaya.com/en/learn/fine-tuning-cookbook/ftc-awq-activation-aware-quantization
> Updated: 2026-05-14T14:42:57.102Z
> Category: Fine-Tuning Cookbook (Model-by-Model)
> Module: Part X — Quantization Engineering

**TLDR:** AWQ (Lin et al. 2023) is an activation-aware alternative to GPTQ. Its "salient channel scaling" technique protects the weight channels that carry activation outliers. Quantize Llama 3.1 8B in ~8 min on an RTX 4090 via autoawq; slightly better WikiText-2 PPL than GPTQ, and easier vLLM serving.
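The core trick behind salient channel scaling can be shown in a few lines of numpy. This is a minimal sketch of the idea, not AutoAWQ's implementation: the toy shapes, the per-row round-to-nearest quantizer, and the fixed `alpha = 0.5` exponent (one point of the grid AWQ searches over) are all illustrative assumptions. Weight columns that feed on activation-outlier channels are scaled up before quantization, and the inverse scale is folded back afterwards, so the layer's output is unchanged except for rounding error, which shrinks exactly where the activations are large.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear layer y = x @ W.T: weights (out=8, in=16), calibration activations (64, 16).
W = rng.normal(size=(8, 16))
W[:, 3] *= 0.2          # salient *activation* channels need not have large weights
X = rng.normal(size=(64, 16))
X[:, 3] *= 20.0         # input channel 3 carries an activation outlier

def quantize(w, n_bits=4):
    """Symmetric per-row round-to-nearest quantization (illustrative, not GEMM-packed)."""
    qmax = 2 ** (n_bits - 1) - 1
    scale = np.abs(w).max(axis=1, keepdims=True) / qmax
    return np.round(w / scale) * scale

# Activation-awareness: per-input-channel saliency from mean activation magnitude.
act_mag = np.abs(X).mean(axis=0)
s = (act_mag / act_mag.mean()) ** 0.5   # alpha = 0.5, an assumed grid point

# Scale salient weight columns up before quantizing, fold 1/s back after.
# (W * s) * (1/s) == W exactly, so only the rounding-error distribution changes.
W_awq = quantize(W * s) / s
W_rtn = quantize(W)     # plain round-to-nearest baseline

err_awq = np.abs(X @ W_awq.T - X @ W.T).mean()
err_rtn = np.abs(X @ W_rtn.T - X @ W.T).mean()
print(f"RTN output error {err_rtn:.4f}  vs  AWQ-style scaled error {err_awq:.4f}")
```

The scaled version wins because the rounding step on the outlier channel is effectively divided by `s[3]`, while the mathematically exact rescaling costs nothing at inference: in the real algorithm the `1/s` factor is fused into the preceding layer rather than kept as a separate divide.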

