
Expert Divergence Learning for MoE-based Language Models

About

The Mixture-of-Experts (MoE) architecture is a powerful technique for scaling language models, yet it often suffers from expert homogenization, in which experts learn redundant functions, limiting the architecture's potential. To address this, we introduce Expert Divergence Learning, a novel pre-training strategy that explicitly encourages functional specialization among experts. Our method incorporates a label-driven auxiliary loss that leverages the domain labels inherent in pre-training corpora to maximize the Jensen-Shannon Divergence between the expert routing distributions of different data domains. This objective guides the model to develop divergent routing policies for different domains and similar routing policies within the same domain, leading to emergent and organized expert specialization. We validate our approach by pre-training MoE models of up to 15 billion parameters from scratch. Experimental results demonstrate that models trained with Expert Divergence Learning not only achieve lower language modeling loss but also exhibit significant performance improvements across a diverse range of downstream benchmarks. Further analysis confirms that our method effectively mitigates expert homogenization and induces greater functional specialization, all with negligible computational overhead during training.
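The core idea can be sketched in code. The snippet below is a minimal illustration, not the paper's implementation: it assumes each domain is summarized by its mean expert-routing distribution, computes the pairwise Jensen-Shannon Divergence between domains, and returns its negative mean as an auxiliary loss (so that minimizing the loss maximizes cross-domain divergence). The function names and the averaging scheme are illustrative assumptions.

```python
import numpy as np

def jensen_shannon_divergence(p, q, eps=1e-12):
    """JSD between two routing distributions (base 2, bounded in [0, 1])."""
    p = np.asarray(p, dtype=np.float64) + eps
    q = np.asarray(q, dtype=np.float64) + eps
    p /= p.sum()
    q /= q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2(a / b))  # KL divergence in bits
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def divergence_loss(domain_routing):
    """Auxiliary loss: negative mean pairwise JSD across domains.

    domain_routing maps a domain label to that domain's mean expert-routing
    distribution (hypothetical input format). Minimizing this loss pushes
    different domains toward distinct expert-routing profiles.
    """
    dists = list(domain_routing.values())
    total, pairs = 0.0, 0
    for i in range(len(dists)):
        for j in range(i + 1, len(dists)):
            total += jensen_shannon_divergence(dists[i], dists[j])
            pairs += 1
    return -total / pairs
```

In a real training loop this term would be added, with a small weight, to the language-modeling loss alongside the usual load-balancing objective; the weighting and the exact aggregation of routing statistics are details the abstract does not specify.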

Jiaang Li, Haibin Chen, Langming Liu, Yujin Yuan, Yadao Wang, Yizhen Zhang, Chengting Yu, Xin Tong, Weidong Zhang, Shilei Liu, Wenbo Su, Bo Zheng • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Question Answering | ARC-E | Accuracy | 60.85 | 416 |
| Question Answering | ARC-C | Accuracy | 35.25 | 87 |
| Language Understanding | MMLU | MMLU Score | 33.21 | 70 |
| Language Understanding | CMMLU | Accuracy | 36.58 | 42 |
| Reading Comprehension | RACE-m | Accuracy | 34.66 | 31 |
| Reading Comprehension | RACE | RACE Middle Score | 34.54 | 21 |
| Reading Comprehension | RACE-h | Accuracy | 28.73 | 18 |
| Language Understanding | CEval | Accuracy | 33.81 | 17 |
| Question Answering | ARC | ARC-E Accuracy | 59.08 | 14 |
