# Mixture of Experts

> Source: https://sukruyusufkaya.com/en/glossary/mixture-of-experts
> Updated: 2026-05-13T21:12:10.530Z
> Type: glossary
> Category: uretken-yapay-zeka-ve-llm

**TLDR:** An approach in which only the relevant expert subnetworks are activated for each input, trading dense computation for scale and efficiency.

Mixture of Experts (MoE) makes it possible to scale to very large model capacity without using every parameter at every step: a learned router sends each input to a small subset of expert subnetworks, so only those experts run. This keeps the number of active parameters per token low while the total knowledge capacity of the model stays high. The trade-offs are routing stability, load balance across experts (so no expert is starved or overloaded), and the added complexity of training the router jointly with the experts.
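The routing idea above can be sketched as a toy top-k gated layer. This is a minimal illustration, not an implementation from the source: the dimensions, the linear-layer experts, and the `moe_layer` helper are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes (illustrative, not from the source).
d_model, n_experts, top_k = 8, 4, 2

# Router: scores each token against every expert.
W_router = rng.normal(size=(d_model, n_experts))
# Experts: here each is just one linear map for simplicity.
W_experts = rng.normal(size=(n_experts, d_model, d_model))

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def moe_layer(x):
    """Route each token to its top_k experts; only those experts run."""
    probs = softmax(x @ W_router)                  # (tokens, n_experts)
    top = np.argsort(-probs, axis=-1)[:, :top_k]   # chosen expert ids
    out = np.zeros_like(x)
    for t, token in enumerate(x):
        chosen = probs[t, top[t]]
        gates = chosen / chosen.sum()              # renormalize over top_k
        for gate, e in zip(gates, top[t]):
            out[t] += gate * (token @ W_experts[e])
    return out, top

tokens = rng.normal(size=(3, d_model))
y, routed = moe_layer(tokens)
print(y.shape)      # (3, 8): same shape as the input
print(routed)       # per token, the top_k expert ids that were activated
```

Note how the compute cost scales with `top_k`, not `n_experts`: capacity can grow by adding experts while the per-token work stays roughly constant. The load-balance challenge mentioned above shows up here too; in practice an auxiliary loss is added so the router does not collapse onto a few favored experts.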