
SOLAR: Communication-Efficient Model Adaptation via Subspace-Oriented Latent Adapter Reparameterization

About

Parameter-efficient fine-tuning (PEFT) methods, such as LoRA, enable scalable adaptation of foundation models by injecting low-rank adapters. However, their communication and storage costs remain a major bottleneck in resource-constrained settings. We propose SOLAR (Subspace-Oriented Latent Adapter Reparameterization), a post-training compression framework that substantially reduces the communication cost (i.e., the number of parameters to transmit or store) of PEFT adapters. SOLAR expresses each PEFT update as a linear combination of basis vectors formed from the foundation model's singular vectors with controlled random perturbations. By exploiting the subspace similarity (the alignment of principal directions) between the foundation model and task-specific fine-tuned updates, SOLAR decouples the adapter size from PEFT structure and ensures compact yet expressive representations. It is model-agnostic and compatible with existing PEFT methods, including LoRA, AdaLoRA, and other adapter modules. We theoretically establish a bound on the reconstruction error. Experiments on language and vision tasks using LLaMA, GPT, and ViT models demonstrate that SOLAR preserves task performance while significantly reducing model representation sizes, offering an effective and communication-efficient solution for deployment in distributed systems and edge devices.
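The core reparameterization idea — fit a PEFT update as a linear combination of basis matrices built from the base weight's singular vectors plus small random perturbations, then transmit only the combination coefficients — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the shapes, basis size `m`, perturbation scale `eps`, and the least-squares fit are all illustrative choices, and the random synthetic data here will not show the subspace-alignment benefit that real fine-tuned updates exhibit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: a base weight W0 and a LoRA-style low-rank update dW = B @ A.
d, k, r = 256, 128, 8
W0 = rng.standard_normal((d, k))
B, A = rng.standard_normal((d, r)), rng.standard_normal((r, k))
dW = B @ A

# Basis matrices from the base model's singular vectors, with small controlled
# random perturbations (basis size m and scale eps are illustrative).
U, _, Vt = np.linalg.svd(W0, full_matrices=False)
m, eps = 32, 0.01
basis = [np.outer(U[:, i], Vt[i, :]) + eps * rng.standard_normal((d, k))
         for i in range(m)]

# Least-squares fit of dW as a linear combination of the m basis matrices:
# only m coefficients need to be stored or transmitted, instead of the
# r * (d + k) adapter parameters of the original LoRA update.
Phi = np.stack([b.ravel() for b in basis], axis=1)        # (d*k, m)
coeffs, *_ = np.linalg.lstsq(Phi, dW.ravel(), rcond=None)
dW_hat = (Phi @ coeffs).reshape(d, k)

rel_err = np.linalg.norm(dW - dW_hat) / np.linalg.norm(dW)
print(f"{m} coefficients vs {r * (d + k)} adapter params, rel. error {rel_err:.3f}")
```

In a real deployment only `coeffs` (and the shared random seed for the perturbations) would cross the network; the receiver regenerates the basis from the foundation model it already holds and reconstructs `dW_hat` locally.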

Seyed Mahmoud Sajjadi Mohammadabadi, Xiaolong Ma, Lei Yang, Feng Yan, Junshan Zhang • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | ImageNet-1K | Top-1 Acc | 81.3 | 1239 |
| Image Classification | SUN397 | -- | -- | 441 |
| Image Classification | CUB-200 | -- | -- | 106 |
| Image Classification | Oxford Pets | Top-1 Acc | 92.6 | 94 |
| Language Understanding | MMLU | MMLU Accuracy | 60.9 | 77 |
| Multi-task Language Understanding | MMLU | MMLU Accuracy | 54.5 | 59 |
| Image Classification | CIFAR-10 (full) | Top-1 Acc | 98.5 | 23 |
| Natural Language Generation | E2E | METEOR | 0.464 | 17 |
| Image Classification | CIFAR-10 (10 samples/class) | Top-1 Accuracy | 97 | 12 |
| Image Classification | CIFAR-100 (10 samples/class) | Top-1 Accuracy | 87.9 | 12 |

Showing 10 of 16 rows.
