
ASVD: Activation-aware Singular Value Decomposition for Compressing Large Language Models

About

In this paper, we introduce a new post-training compression paradigm for Large Language Models (LLMs) to facilitate their wider adoption. We study low-rank decomposition of LLM weights and find that the challenges of this task stem from (1) the distribution variance in LLM activations and (2) the differing sensitivity of various layer types. To address these issues, we propose a training-free approach called Activation-aware Singular Value Decomposition (ASVD). Specifically, ASVD manages activation outliers by transforming the weight matrix based on the activation distribution. This transformation allows the outliers in the activation matrix to be absorbed into the transformed weight matrix, thereby improving decomposition accuracy. Additionally, we propose an efficient iterative calibration process that optimizes layer-specific decomposition by accounting for the varying sensitivity of different LLM layers. In this way, ASVD can compress a network by 10%-30%. Building on the successful low-rank decomposition of the projection matrices in the self-attention module, we further apply ASVD to compress the KV cache. By reducing the channel dimension of the KV activations, the memory required for the KV cache can be largely reduced. ASVD achieves a further 50% KV cache reduction without performance drop, in a training-free manner.
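The activation-aware transformation described above can be sketched as follows: a diagonal scaling matrix S, built from activation statistics, is folded into the weight matrix before SVD, and its inverse is folded back onto the input side, so outlier channels are emphasized during decomposition. This is a minimal NumPy sketch under illustrative assumptions (mean-absolute-activation scaling, synthetic data); it does not reproduce the paper's exact scaling rule or rank-selection procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 64, 64, 16

W = rng.normal(size=(d_out, d_in))
# Synthetic calibration activations with a few large-magnitude channels,
# mimicking the activation-outlier phenomenon the method targets.
X = rng.normal(size=(d_in, 256))
X[:4] *= 30.0  # outlier channels

# Diagonal scaling from activation statistics (one possible choice:
# mean absolute activation per input channel).
s = np.abs(X).mean(axis=1)
S = np.diag(s)
S_inv = np.diag(1.0 / s)

def truncated_svd(M, k):
    """Best rank-k approximation of M via SVD."""
    U, sigma, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :k] * sigma[:k]) @ Vt[:k]

# Plain SVD baseline: approximate W directly.
W_plain = truncated_svd(W, rank)

# Activation-aware variant: decompose the scaled weight W S, then fold
# S^{-1} back into the input transform, so W ~= trunc_svd(W S, k) S^{-1}.
W_asvd = truncated_svd(W @ S, rank) @ S_inv

# Compare output reconstruction errors on the calibration activations.
err_plain = np.linalg.norm(W_plain @ X - W @ X)
err_asvd = np.linalg.norm(W_asvd @ X - W @ X)
print("plain SVD error:", err_plain)
print("activation-aware error:", err_asvd)
```

On activations with outlier channels, the scaled decomposition spends its rank budget on the directions that matter at inference time, so its output error is typically lower than plain truncated SVD at the same rank.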

Zhihang Yuan, Yuzhang Shang, Yue Song, Dawei Yang, Qiang Wu, Yan Yan, Guangyu Sun • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Language Modeling | WikiText2 | Perplexity | 11.14 | 1875 |
| Language Modeling | WikiText-2 (test) | PPL | 6.74 | 1541 |
| Language Modeling | C4 | Perplexity | 7.66 | 1182 |
| Mathematical Reasoning | GSM8K | Accuracy | 44 | 983 |
| Code Generation | HumanEval | Pass@1 | 41 | 850 |
| Language Modeling | WikiText-2 | Perplexity (PPL) | 6.54 | 841 |
| Language Modeling | PTB | Perplexity | 16.55 | 650 |
| Language Modeling | C4 (test) | Perplexity | 15.93 | 268 |
| Reasoning | ARC Easy | Accuracy | 70 | 183 |
| Reasoning | HellaSwag (HS) | Accuracy | 50 | 142 |

Showing 10 of 25 rows.
