
Cross-Lingual Activation Steering for Multilingual Language Models

About

Large language models exhibit strong multilingual capabilities, yet significant performance gaps persist between dominant and non-dominant languages. Prior work attributes this gap to imbalances between shared and language-specific neurons in multilingual representations. We propose Cross-Lingual Activation Steering (CLAS), a training-free inference-time intervention that selectively modulates neuron activations. We evaluate CLAS on classification and generation benchmarks, achieving average improvements of 2.3% (Acc.) and 3.4% (F1) respectively, while maintaining high-resource language performance. We discover that effective transfer operates through functional divergence rather than strict alignment; performance gains correlate with increased language cluster separation. Our results demonstrate that targeted activation steering can unlock latent multilingual capacity in existing models without modification to model weights.
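The abstract describes a training-free, inference-time intervention that adds a steering signal to selected neuron activations. As a rough illustration of how such steering is typically computed, here is a minimal NumPy sketch: a steering vector is taken as the mean activation difference between a high-resource and a low-resource language, then added to a chosen subset of neurons at inference. The function names, the scaling factor `alpha`, and the neuron-selection step are assumptions for illustration, not the paper's actual CLAS implementation.

```python
import numpy as np

def steering_vector(acts_src, acts_tgt):
    """Mean activation difference between source-language and
    target-language examples (shape: [n_examples, hidden_dim])."""
    return acts_src.mean(axis=0) - acts_tgt.mean(axis=0)

def apply_steering(hidden, vec, neuron_idx, alpha=1.0):
    """Training-free intervention: shift only the selected neurons
    of a hidden state by a scaled steering vector."""
    steered = hidden.copy()
    steered[..., neuron_idx] += alpha * vec[neuron_idx]
    return steered

# Toy example: steer neurons 1 and 3 of a 4-dim hidden state.
hidden = np.zeros((2, 4))
vec = np.array([1.0, 2.0, 3.0, 4.0])
out = apply_steering(hidden, vec, neuron_idx=[1, 3], alpha=0.5)
```

In practice such a shift would be applied inside the model's forward pass (e.g. via a hook on a transformer layer), leaving all weights untouched, which matches the paper's claim of requiring no modification to model weights.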

Rhitabrat Pokharel, Ameeta Agrawal, Tanay Nagar • 2026

Related benchmarks

Task                        Dataset            Result             Rank
Natural Language Inference  XNLI               Accuracy: 59.93    111
Question Answering          XQuAD 1.0 (test)   --                 10
