
MaLA-500: Massive Language Adaptation of Large Language Models

About

Large language models (LLMs) have advanced the state of the art in natural language processing. However, their predominant design for English or a limited set of languages creates a substantial gap in their effectiveness for low-resource languages. To bridge this gap, we introduce MaLA-500, a novel large language model designed to cover an extensive range of 534 languages. To train MaLA-500, we employ vocabulary extension and continued pretraining on LLaMA 2 with Glot500-c. Our intrinsic evaluation demonstrates that MaLA-500 is better at predicting the given texts of low-resource languages than existing multilingual LLMs. Moreover, the extrinsic evaluation of in-context learning shows that MaLA-500 outperforms previous LLMs on SIB200 and Taxi1500 by significant margins of 11.68% and 4.82% macro-average accuracy across languages, respectively. We release MaLA-500 at https://huggingface.co/MaLA-LM
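The abstract reports gains in macro-average accuracy across languages, i.e., each language's accuracy is computed separately and the per-language accuracies are averaged with equal weight. As a minimal sketch of that metric (the language codes and counts below are illustrative, not from the paper):

```python
def macro_average_accuracy(per_language_counts):
    """Macro-average accuracy: the mean of per-language accuracies,
    weighting each language equally regardless of how many test
    examples it has (unlike micro-averaging, which pools examples)."""
    per_lang_acc = [correct / total for correct, total in per_language_counts.values()]
    return sum(per_lang_acc) / len(per_lang_acc)

# Hypothetical (correct, total) counts for three languages.
scores = {
    "swh_Latn": (45, 50),  # accuracy 0.90
    "quy_Latn": (20, 50),  # accuracy 0.40
    "ace_Latn": (35, 50),  # accuracy 0.70
}
print(macro_average_accuracy(scores))  # → 0.666...
```

Equal per-language weighting matters for a 534-language model: it prevents a handful of high-resource languages with large test sets from dominating the score.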

Peiqin Lin, Shaoxiong Ji, Jörg Tiedemann, André F. T. Martins, Hinrich Schütze • 2024

Related benchmarks

Task: Language Modeling
Dataset: Flores-200 (test)
Result: 84.7 mean perplexity
Rank: 12
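The benchmark result above is a mean perplexity. Under the standard definition, a text's perplexity is the exponential of its average negative token log-likelihood, and the corpus score averages over texts; the exact averaging used for Flores-200 is not specified in this excerpt, so the sketch below assumes per-text averaging (the log-probabilities are made-up numbers):

```python
import math

def mean_perplexity(token_logprobs_per_text):
    """Perplexity of one text = exp(-mean natural-log token probability);
    lower is better. Returns the mean of per-text perplexities."""
    ppls = [math.exp(-sum(lps) / len(lps)) for lps in token_logprobs_per_text]
    return sum(ppls) / len(ppls)

# Two hypothetical texts with per-token log-probabilities.
texts = [
    [-1.0, -2.0, -3.0],  # mean NLL 2.0 → perplexity e^2 ≈ 7.389
    [-0.5, -0.5],        # mean NLL 0.5 → perplexity e^0.5 ≈ 1.649
]
print(mean_perplexity(texts))
```

This is the intrinsic evaluation referenced in the abstract: a model that predicts low-resource-language text better assigns higher token probabilities and thus achieves lower perplexity.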
