
AA-SVD: Anchored and Adaptive SVD for Large Language Model Compression

About

We introduce a fast, low-rank factorization-based framework for compressing large language models that enables rapid compression of billion-parameter models without retraining. Existing factorization-based approaches either optimize only on the original inputs, ignoring distribution shifts introduced by upstream compression and thus propagating errors forward, or rely only on the shifted inputs and risk drifting away from the original outputs; our approach accounts for both. Beyond compressing individual layers, we further refine each transformer block end-to-end, minimizing block-level output distortion and allowing compressed layers to jointly compensate for accumulated errors. By anchoring each compressed layer to the original outputs while explicitly modeling input distribution shifts, our method finds a low-rank approximation that maintains functional equivalence with the original model. Experiments on large language models show that our method consistently outperforms existing SVD-based baselines across compression ratios. The advantage becomes increasingly pronounced at aggressive compression budgets, where competing methods degrade substantially or collapse entirely, making our approach a practical option for efficient, large-scale model deployment.
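To make the setting concrete, the sketch below shows plain truncated-SVD compression of a linear layer's weight matrix, the baseline factorization that methods like the one described above build on. This is an illustrative example only, not the paper's AA-SVD objective (which additionally anchors to original outputs and models input distribution shifts); the function name and shapes are our own choices.

```python
import numpy as np

def truncated_svd_compress(W, rank):
    """Factor W (d_out x d_in) into two thin matrices A (d_out x rank)
    and B (rank x d_in) so that W is approximated by A @ B."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # absorb singular values into the left factor
    B = Vt[:rank, :]
    return A, B

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 128))   # toy weight matrix
A, B = truncated_svd_compress(W, rank=16)

# Parameter count drops from d_out*d_in to rank*(d_out + d_in).
orig_params = W.size                  # 64 * 128 = 8192
compressed_params = A.size + B.size   # 16 * (64 + 128) = 3072
rel_err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
```

At inference time the layer computes `A @ (B @ x)` instead of `W @ x`, which is where the memory and compute savings come from; retraining-free methods like the one in this paper differ in how they choose the factors, not in this basic structure.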

Atul Kumar Sinha, François Fleuret • 2026

Related benchmarks

Task                  | Dataset                                                                        | Metric               | Result | Rank
Language Modeling     | WikiText2                                                                      | Perplexity           | 5.95   | 2839
Language Modeling     | C4                                                                             | Perplexity           | 8.37   | 1071
Language Modeling     | PTB                                                                            | Perplexity           | 8.97   | 1034
Language Modeling     | Wiki2                                                                          | PPL                  | 5.95   | 149
Commonsense Reasoning | Average 7 Commonsense Reasoning Tasks                                          | Avg Accuracy         | 53     | 72
Commonsense Reasoning | Commonsense Reasoning Suite                                                    | OpenBookQA Accuracy  | 33     | 48
Language Modeling     | WikiText-2, PTB, C4                                                            | WikiText-2 Perplexity| 15.12  | 38
Commonsense Reasoning | Commonsense Reasoning (OpenBookQA, ARC-E, ARC-C, WinoGrande, PIQA, MathQA, HellaSwag) | OpenBookQA    | 31     | 7
