Tin Rabzelj

MoE

Mixture of Experts (MoE) is a neural network architecture that combines multiple specialized sub-networks (experts) with a gating mechanism that routes each input to the most relevant experts. Because only a few experts run per input, the model's capacity can grow without a proportional increase in per-input compute.
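As a rough illustration of the routing idea, here is a minimal sketch of a top-k gated MoE layer. It assumes PyTorch; the class name, expert structure, and hyperparameters are all illustrative choices, not a reference implementation.

```python
# Minimal MoE sketch (assumes PyTorch; all names/shapes are illustrative).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    """Routes each input to its top-k experts via a learned gating network."""

    def __init__(self, dim: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Gating network: produces a score for every expert per input.
        self.gate = nn.Linear(dim, num_experts)
        # Experts: independent feed-forward sub-networks.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, dim)
        scores = self.gate(x)                                   # (batch, num_experts)
        top_scores, top_idx = scores.topk(self.top_k, dim=-1)   # keep only top-k experts
        weights = F.softmax(top_scores, dim=-1)                 # normalize over chosen experts

        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            idx = top_idx[:, slot]          # expert chosen for this slot, per input
            w = weights[:, slot:slot + 1]   # its gating weight
            for e, expert in enumerate(self.experts):
                mask = idx == e
                if mask.any():
                    # Only selected experts run, so compute scales with k, not num_experts.
                    out[mask] += w[mask] * expert(x[mask])
        return out


if __name__ == "__main__":
    layer = MoELayer(dim=16, num_experts=4, top_k=2)
    tokens = torch.randn(8, 16)
    print(layer(tokens).shape)  # torch.Size([8, 16])
```

With `top_k=2` out of 4 experts, each input activates only half of the layer's parameters, which is the sense in which MoE scales capacity while keeping per-input compute bounded.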
