
Continual-learning for Modelling Low-Resource Languages from Large Language Models

About

Adapting a language model to a multilingual scenario raises several challenges, chief among them catastrophic forgetting. In particular, small language models (SLMs) built for low-resource languages by adapting large language models (LLMs) are prone to forgetting previously learned capabilities. This work proposes a continual-learning strategy that combines parts-of-speech (POS)-based code-switching with a replay-adapter strategy to mitigate catastrophic forgetting while training an SLM from an LLM. Experiments on vision-language tasks such as visual question answering, as well as on language modelling, demonstrate the effectiveness of the proposed architecture.
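To make the POS-based code-switching idea concrete, here is a minimal sketch of how such an augmentation step might look. The toy POS tagger, bilingual lexicon, the `switch_tags` set, and the switching probability `p` are all illustrative assumptions, not the paper's actual components or policy.

```python
import random

# Toy POS tags and a toy bilingual lexicon (source -> hypothetical
# low-resource target). The paper's actual tagger, lexicon, and
# switching policy are not specified here; these are placeholders.
POS = {"the": "DET", "cat": "NOUN", "sat": "VERB", "on": "ADP", "mat": "NOUN"}
LEXICON = {"cat": "billi", "mat": "chatai"}

def code_switch(tokens, switch_tags={"NOUN"}, p=1.0, rng=random.Random(0)):
    """Replace tokens whose (toy) POS tag is in switch_tags with their
    lexicon translation, with probability p per eligible token."""
    out = []
    for tok in tokens:
        if POS.get(tok) in switch_tags and tok in LEXICON and rng.random() < p:
            out.append(LEXICON[tok])
        else:
            out.append(tok)
    return out

# With p=1.0, every noun covered by the lexicon is switched.
print(code_switch(["the", "cat", "sat", "on", "the", "mat"]))
# → ['the', 'billi', 'sat', 'on', 'the', 'chatai']
```

In practice one would use a real POS tagger and a curated bilingual lexicon, and mix the code-switched sentences into the continual-learning data stream alongside replayed examples.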

Santosh Srinath K, Mudit Somani, Varun Reddy Padala, Prajna Devi Upadhyay, Abhijit Das • 2026

Related benchmarks

Task                 Dataset        Result                   Rank
Semantic Parsing     mTOP (test)    Average Score: 91.8      17
Question Answering   PAXQA (test)   Average Accuracy: 79.36  12
