MEVZU N°124 · ISTANBUL
Glossary · Advanced · 2017

Mixture of Experts (MoE)

An architecture in which a gating (router) network activates only a small subset of expert sub-networks for each token, combining the capacity of a very large model with the per-token compute cost of a much smaller dense one.

EN — English term
Mixture of Experts (MoE)
TR — Turkish term
Uzmanlar Karışımı (MoE)
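
A minimal sketch of the routing idea, assuming PyTorch; the layer sizes, top-2 routing, and all names are illustrative choices, not taken from any particular MoE implementation. A small gating network scores the experts for each token, only the top-k experts run on that token, and their outputs are combined with the renormalised gate weights.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Sparse mixture-of-experts layer: a gate picks the top-k experts
    per token, and only those experts are evaluated."""

    def __init__(self, d_model=64, d_hidden=256, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(d_model, n_experts)  # router scores
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden),
                          nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                       # x: (n_tokens, d_model)
        logits = self.gate(x)                   # (n_tokens, n_experts)
        topk_val, topk_idx = logits.topk(self.k, dim=-1)
        weights = F.softmax(topk_val, dim=-1)   # renormalise over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):
            idx = topk_idx[:, slot]
            w = weights[:, slot].unsqueeze(-1)
            for e, expert in enumerate(self.experts):
                mask = idx == e
                if mask.any():
                    # each expert runs only on the tokens routed to it
                    out[mask] += w[mask] * expert(x[mask])
        return out

# usage: 10 tokens, 64-dim; only 2 of the 8 experts run per token
tokens = torch.randn(10, 64)
layer = TopKMoE()
print(layer(tokens).shape)   # torch.Size([10, 64])
```

In this sketch the parameter count grows with the number of experts, but each token touches only k of them, which is the source of the cheaper inference mentioned in the definition.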