
Tiny Aya: Bridging Scale and Multilingual Depth

About

Tiny Aya redefines what a small multilingual language model can achieve. Trained on 70 languages and refined through region-aware post-training, it delivers state-of-the-art translation quality, strong multilingual understanding, and high-quality target-language generation, all with just 3.35B parameters. The release includes a pretrained foundation model, a globally balanced instruction-tuned variant, and three region-specialized models targeting languages from Africa, South Asia, Europe, Asia-Pacific, and West Asia. This report details the training strategy, data composition, and comprehensive evaluation framework behind Tiny Aya, and presents an alternative scaling path for multilingual AI: one centered on efficiency, balanced performance across languages, and practical deployment.

Alejandro R. Salamanca, Diana Abagyan, Daniel D'souza, Ammar Khairi, David Mora, Saurabh Dash, Viraat Aryabumi, Sara Rajaee, Mehrnaz Mofakhami, Ananya Sahu, Thomas Euyang, Brittawnya Prince, Madeline Smith, Hangyu Lin, Acyr Locatelli, Sara Hooker, Tom Kocmi, Aidan Gomez, Ivan Zhang, Phil Blunsom, Nick Frosst, Joelle Pineau, Beyza Ermis, Ahmet Üstün, Julia Kreutzer, Marzieh Fadaee • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Safety Evaluation | MultiJail | Safe Response Rate | 95 | 66 |
| Short Question Answering | BLEnD Short Question Answer | Average Accuracy | 41 | 18 |
| Machine Translation | WMT 24++ | Standard Deviation of Score | 10 | 16 |
| Machine Translation | BOUQuET XX-En | ChrF++ (high) | 62.7 | 16 |
| Translation | FLoRes+ XX-En, high resource level | ChrF++ | 61.3 | 16 |
| Translation | FLoRes+ En-YY, mid resource level | ChrF++ | 27.6 | 16 |
| Machine Translation | BOUQuET En-YY | ChrF++ (high) | 58.5 | 16 |
| Translation | FLoRes+ En-YY, high resource level | ChrF++ | 53.6 | 16 |
| Translation | FLoRes+ XX-En, mid resource level | ChrF++ | 39.9 | 16 |
| Translation | FLoRes+ XX-En, total | ChrF++ | 40.4 | 16 |

Showing 10 of 35 rows.
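Most translation rows above report ChrF++, a character-level F-score that correlates well with human judgments and is robust for morphologically rich languages. As a rough illustration of the idea, here is a minimal sketch of the character n-gram F-score underlying chrF (full ChrF++, as implemented in tools like sacrebleu, additionally averages in word 1- and 2-gram F-scores); the function name, whitespace handling, and defaults here are illustrative, not the official implementation.

```python
from collections import Counter

def char_ngrams(text: str, n: int) -> Counter:
    """Count character n-grams, ignoring whitespace."""
    text = text.replace(" ", "")
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def chrf_sketch(hypothesis: str, reference: str, max_n: int = 6, beta: float = 2.0) -> float:
    """Simplified chrF: mean character n-gram F-beta over n = 1..max_n, scaled to 0-100.

    beta = 2 weights recall twice as much as precision, as in the chrF metric.
    """
    scores = []
    for n in range(1, max_n + 1):
        hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
        if not hyp or not ref:
            continue  # no n-grams of this order in one of the strings
        overlap = sum((hyp & ref).values())  # clipped n-gram matches
        precision = overlap / sum(hyp.values())
        recall = overlap / sum(ref.values())
        if precision + recall == 0:
            scores.append(0.0)
            continue
        f_beta = (1 + beta**2) * precision * recall / (beta**2 * precision + recall)
        scores.append(f_beta)
    return 100 * sum(scores) / len(scores) if scores else 0.0
```

An exact match scores 100, disjoint strings score 0, and partial character overlap falls in between; the published scores in the table come from the full metric over entire test sets, not this sketch.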
