
DIETA: A Decoder-only transformer-based model for Italian-English machine TrAnslation

About

In this paper, we present DIETA, a small, decoder-only Transformer model with 0.5 billion parameters, specifically designed and trained for Italian-English machine translation. We collect and curate a large parallel corpus of approximately 207 million Italian-English sentence pairs spanning diverse domains, including parliamentary proceedings, legal texts, web-crawled content, subtitles, news, and literature, complemented by 352 million back-translated sentence pairs generated with pretrained models. Additionally, we create and release a new small-scale evaluation set of 450 sentences based on 2025 WikiNews articles, enabling assessment of translation quality on contemporary text. Comprehensive evaluations show that DIETA achieves competitive performance on multiple Italian-English benchmarks, consistently ranking in the second quartile of a 32-system leaderboard and outperforming most other sub-3B models on four out of five test suites. The training script, trained models, curated corpus, and newly introduced evaluation set are made publicly available at https://github.com/pkasela/DIETA-Machine-Translation, facilitating further research and development in specialized Italian-English machine translation.

Pranav Kasela, Marco Braga, Alessandro Ghiotto, Andrea Pilzer, Marco Viviani, Alessandro Raganato • 2026
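
For readers who want to try the released checkpoints, the sketch below shows how a decoder-only machine translation model of this kind is typically run with the Hugging Face transformers library. The model ID and prompt template are hypothetical placeholders, not confirmed by the paper; see the GitHub repository above for the actual checkpoint names and expected input format.

```python
# Minimal sketch: running a decoder-only translation model with Hugging Face
# transformers. The model ID and prompt template below are HYPOTHETICAL
# placeholders; consult https://github.com/pkasela/DIETA-Machine-Translation
# for the actual checkpoint name and input format.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "pkasela/DIETA-0.5B"  # hypothetical model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Decoder-only MT models are prompted with the source sentence and generate
# the translation as a continuation of the prompt.
prompt = "Translate Italian to English: Il gatto dorme sul divano.\n"  # hypothetical template
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)

# Strip the prompt tokens and decode only the generated continuation.
new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```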

Related benchmarks

Task                 Dataset                       Result (sacreBLEU)  Rank
Machine Translation  NTREX-128 it->en (test)       42.7624             35
Machine Translation  WikiNews-25 it->en            43.8153             35
Translation          FLORES-200 en->it (devtest)   30.4376             35
Machine Translation  NTREX-128 en->it (test)       36.3722             35
Machine Translation  WikiNews-25 en->it            46.0306             35
Translation          FLORES-200 it->en (devtest)   33.3923             35
Machine Translation  Tatoeba en->it                58.5519             33
Machine Translation  Tatoeba it->en                70.022              33
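
All results above are corpus-level sacreBLEU scores. As a reference point, here is a minimal sketch of how such a score is computed with the sacrebleu Python package; the hypothesis and reference sentences are illustrative and not drawn from the evaluation sets.

```python
# Minimal sketch: corpus-level sacreBLEU, the metric reported in the table
# above. The hypothesis/reference sentences are illustrative examples only.
import sacrebleu

hypotheses = ["The cat sleeps on the sofa."]          # system translations, one per segment
references = [["The cat is sleeping on the couch."]]  # one inner list per reference stream

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"sacreBLEU: {bleu.score:.2f}")
```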
