# Knowledge Distillation in Vision

> Source: https://sukruyusufkaya.com/en/glossary/knowledge-distillation-in-vision
> Updated: 2026-05-13T20:57:37.375Z
> Type: glossary
> Category: bilgisayarli-goru

**TLDR:** An approach for transferring knowledge from a large, powerful vision model into a smaller, more efficient one.

Knowledge distillation is used to compress high-performing but heavy vision models into lighter systems. During training, the student model learns not only from the hard ground-truth labels but also from the teacher model's softened output distribution (soft labels), which conveys how the teacher ranks the incorrect classes relative to one another. This makes it practical to deploy compact models on mobile devices, edge systems, and low-latency production environments. It is a critical family of techniques for balancing performance and efficiency.
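
As a concrete illustration, the most common distillation objective (in the style of Hinton et al.) mixes a cross-entropy term on the hard labels with a KL-divergence term on temperature-softened logits. The sketch below assumes PyTorch; the temperature `T`, mixing weight `alpha`, and the `teacher`/`student` model names are illustrative choices, not details from this entry.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with a soft-target KL term.

    T (temperature) and alpha (mixing weight) are illustrative
    hyperparameters; typical values are tuned per task.
    """
    # Soften both distributions with the temperature T: higher T flattens
    # the teacher's output, exposing its relative class rankings.
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)

    # KL divergence between the softened teacher and student outputs;
    # the T**2 factor keeps gradient magnitudes comparable across temperatures.
    soft_loss = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T ** 2)

    # Standard cross-entropy against the ground-truth (hard) labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    return alpha * soft_loss + (1 - alpha) * hard_loss

# Hypothetical usage: the teacher runs frozen, in eval mode, with no gradients.
# with torch.no_grad():
#     teacher_logits = teacher(images)
# student_logits = student(images)
# loss = distillation_loss(student_logits, teacher_logits, labels)
```

A higher temperature flattens the teacher's distribution so the student also sees the probability mass assigned to incorrect classes; the `T**2` factor is the conventional correction that keeps the soft-loss gradients on the same scale as the hard loss.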