
Scaling Model and Data for Multilingual Machine Translation with Open Large Language Models

About

Open large language models (LLMs) have demonstrated increasingly strong multilingual capabilities in recent years. In this paper, we present a study of open LLMs for multilingual machine translation (MT) across a range of languages, and investigate the effects of model scaling and data scaling when adapting open LLMs to multilingual MT through continual pretraining and instruction finetuning. Based on the Gemma3 model family, we develop MiLMMT-46, which achieves top-tier multilingual translation performance across 46 languages. Extensive experiments show that MiLMMT-46 consistently outperforms recent state-of-the-art (SOTA) models, including Seed-X, HY-MT-1.5, and TranslateGemma, and achieves competitive performance with strong proprietary systems such as Google Translate and Gemini 3 Pro. Models are released at https://huggingface.co/collections/xiaomi-research/milmmt-46, and code is released at https://github.com/xiaomi-research/gemmax.
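
For readers who want to try the released checkpoints, below is a minimal inference sketch using the Hugging Face transformers library. The repository id and the plain instruction-style prompt are assumptions inferred from the collection URL above, not confirmed by the paper; consult the linked collection and the xiaomi-research/gemmax repository for the exact model names and prompt template.

```python
# Minimal translation-inference sketch with Hugging Face transformers.
# ASSUMPTIONS: the repo id "xiaomi-research/MiLMMT-46" and the prompt
# format below are illustrative placeholders; see the HF collection and
# the xiaomi-research/gemmax repo for the released names and templates.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xiaomi-research/MiLMMT-46"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# A plain source -> target instruction prompt (format assumed).
prompt = (
    "Translate this from English to Chinese:\n"
    "English: Open models are catching up fast.\n"
    "Chinese:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Strip the prompt tokens and print only the generated translation.
generated = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```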

Yuzhe Shang, Pengzhi Gao, Wei Liu, Jian Luan, Jinsong Su • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Machine Translation | FLORES+ (test) | spBLEU | 57.64 | 128 |
| Multilingual Machine Translation | FLORES+ (devtest) | spBLEU | 44.34 | 63 |
| Machine Translation | WMT24++ v1.0 (test) | XCOMET Score | 90 | 49 |
| Machine Translation (xx -> zh) | FLORES+ latest (test) | spBLEU | 33.16 | 30 |
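
The spBLEU figures above are BLEU computed over SentencePiece-tokenized text, the standard metric for FLORES-style multilingual evaluation; XCOMET, by contrast, is a learned quality metric from the COMET family. As a rough illustration of how an spBLEU score is produced, here is a sketch using sacreBLEU's FLORES-200 tokenizer. The example sentences are made up, and the leaderboard's exact evaluation settings are not stated here, so treat this as an approximation of the metric rather than a reproduction of the reported numbers.

```python
# Sketch of an spBLEU computation with sacreBLEU's FLORES-200
# SentencePiece tokenizer; the leaderboard's exact settings may differ.
import sacrebleu

hypotheses = ["开放模型正在迅速赶上。"]    # system outputs, one per segment (toy data)
references = [["开源模型正在快速追赶。"]]  # one reference stream, aligned with hypotheses

score = sacrebleu.corpus_bleu(
    hypotheses,
    references,
    tokenize="flores200",  # SentencePiece tokenization -> spBLEU
)
print(f"spBLEU = {score.score:.2f}")
```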
