
Route Experts by Sequence, not by Token

About

Mixture-of-Experts (MoE) architectures scale large language models (LLMs) by activating only a subset of experts per token, but the standard TopK routing assigns the same fixed number of experts to all tokens, ignoring their varying complexity. Prior adaptive routing methods introduce additional modules and hyperparameters, often requiring costly retraining from scratch. We propose Sequence-level TopK (SeqTopK), a minimal modification that shifts the expert budget from the token level to the sequence level. By selecting the top $T \cdot K$ experts across all $T$ tokens, SeqTopK enables end-to-end learned dynamic allocation -- assigning more experts to difficult tokens and fewer to easy ones -- while preserving the same overall budget. SeqTopK requires only a few lines of code, adds less than 1% overhead, and remains fully compatible with pretrained MoE models. Experiments across math, coding, law, and writing show consistent improvements over TopK and prior parameter-free adaptive methods, with gains that become substantially larger under higher sparsity (up to 16.9%). These results highlight SeqTopK as a simple, efficient, and scalable routing strategy, particularly well-suited for the extreme sparsity regimes of next-generation LLMs. Code is available at https://github.com/Y-Research-SBU/SeqTopK.
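The core idea above can be illustrated with a minimal sketch. This is not the paper's implementation (see the linked repository for that); it assumes router scores are comparable across tokens, and the function names `topk_routing` and `seq_topk_routing` are illustrative. Standard TopK picks the best $K$ experts per token; SeqTopK instead picks the best $T \cdot K$ (token, expert) pairs across the whole sequence, so the same total budget is allocated unevenly:

```python
def topk_routing(scores, k):
    """Standard token-level TopK: each token activates exactly k experts.

    scores: T x E list of router scores (T tokens, E experts).
    Returns a T x E boolean activation mask.
    """
    masks = []
    for row in scores:
        top = sorted(range(len(row)), key=lambda e: row[e], reverse=True)[:k]
        masks.append([e in top for e in range(len(row))])
    return masks


def seq_topk_routing(scores, k):
    """SeqTopK sketch: select the top T*k (token, expert) scores across
    the whole sequence, so difficult tokens can claim more experts and
    easy tokens fewer, while the total budget (T*k activations) is
    unchanged relative to token-level TopK.
    """
    T, E = len(scores), len(scores[0])
    flat = [(scores[t][e], t, e) for t in range(T) for e in range(E)]
    flat.sort(reverse=True)                 # rank all token-expert pairs jointly
    chosen = {(t, e) for _, t, e in flat[: T * k]}
    return [[(t, e) in chosen for e in range(E)] for t in range(T)]


# Toy example: 3 tokens, 4 experts, budget k = 2 experts/token on average.
scores = [
    [10.0, 9.0, 8.0, 7.0],   # "hard" token with uniformly high scores
    [1.0, 0.0, -1.0, -2.0],  # "easy" token with low scores
    [5.0, 4.0, 0.0, 0.0],
]
seq_mask = seq_topk_routing(scores, k=2)
per_token = [sum(row) for row in seq_mask]  # experts per token: uneven, sums to 6
```

With these scores, token-level TopK would give every token exactly 2 experts, while SeqTopK spends the same 6 activations as 4 / 0 / 2 across the three tokens, matching the paper's intuition of shifting capacity toward harder tokens.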

Tiansheng Wen, Yifei Wang, Aosong Feng, Long Ma, Xinyang Liu, Yifan Wang, Lixuan Guo, Bo Chen, Stefanie Jegelka, Chenyu You • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Mathematical Reasoning | GSM8K (test) | Accuracy | 55.87 | 770 |
| Code Generation | HumanEval (test) | Pass@1 | 37.2 | 506 |
| Code Generation | MBPP (test) | Pass@1 | 36.41 | 298 |
| Legal Reasoning | Law | Score | 26.52 | 13 |
| Summarization | Summary | Score | 46.4 | 13 |
| Zero-shot Evaluation | Zero-shot Evaluation Suite (LAMBDA, RACE, ARC-E, ARC-C) | LAMBDA Score | 60.82 | 6 |
| Legal Reasoning | Law (test) | Score | 45.29 | 5 |
| Summarization | Summary (test) | Score | 41.31 | 5 |
