Zero-shot Language Evaluation on DCLM Pro
[Chart: per-task and average zero-shot accuracy over time for each method. Tasks: WinoGrande, PIQA, SocialIQA, CommonsenseQA, LAMBADA, HellaSwag, BoolQ, OpenBookQA, ARC-Easy, ARC-Challenge, TriviaQA, MMLU. Updated 1 month ago.]
Evaluation Results
| Method | Links | WinoGrande | PIQA | SocialIQA | CommonsenseQA | LAMBADA | HellaSwag | BoolQ | OpenBookQA | ARC-Easy | ARC-Challenge | TriviaQA | MMLU | Average Accuracy |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| PathMoE (Total Parameters=16.2B...) | 2026.03 | 57.93 | 71.38 | 40.07 | 48.73 | 53.64 | 63.32 | 64.01 | 39.6 | 68.52 | 40.1 | 21.18 | 35.64 | 50.34 |
| Indep-MoE (Total Parameters=16.2B...) | 2026.03 | 56.12 | 68.99 | 40.58 | 43 | 51.97 | 63.22 | 63.06 | 35.8 | 63.43 | 37.29 | 21.91 | 35.27 | 48.39 |
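The Average Accuracy column appears to be the unweighted mean of the twelve per-task scores (the page does not state the aggregation rule, so this is an assumption); a quick check reproduces the reported values:

```python
# Per-task scores copied from the table above, in column order:
# WinoGrande, PIQA, SocialIQA, CommonsenseQA, LAMBADA, HellaSwag,
# BoolQ, OpenBookQA, ARC-Easy, ARC-Challenge, TriviaQA, MMLU
pathmoe = [57.93, 71.38, 40.07, 48.73, 53.64, 63.32,
           64.01, 39.6, 68.52, 40.1, 21.18, 35.64]
indep_moe = [56.12, 68.99, 40.58, 43.0, 51.97, 63.22,
             63.06, 35.8, 63.43, 37.29, 21.91, 35.27]

def avg_accuracy(scores):
    """Unweighted mean over tasks, rounded to 2 decimals (assumed convention)."""
    return round(sum(scores) / len(scores), 2)

print(avg_accuracy(pathmoe))    # 50.34, matching the table
print(avg_accuracy(indep_moe))  # 48.39, matching the table
```

Both reported averages are consistent with a simple unweighted mean over the twelve tasks.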