
Everybody Prune Now: Structured Pruning of LLMs with only Forward Passes

About

Structured pruning is a promising approach to creating smaller, faster large language models. However, existing methods typically rely on computing gradients via backward passes, which inflates memory requirements and compute costs. In this work, we introduce Bonsai, a gradient-free structured pruning method that eliminates the need for backpropagation while achieving state-of-the-art pruning performance. Bonsai uses forward-pass-only perturbative pruning to enable efficient compression of large models on a broader range of hardware configurations. It not only achieves better compression with fewer resources than existing structured pruning approaches, but also produces models that are twice as fast as those generated by semi-structured pruning. As a concrete demonstration, we use Bonsai to prune 7B and 8B models to 50% sparsity on a single A6000 GPU -- a task that is challenging for backprop-based methods in memory-constrained settings, since they require 2-3x the memory. Our results show that removing the backprop requirement not only enables pruning larger models on constrained hardware but can also lead to state-of-the-art efficiency and performance.
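The core idea of forward-pass-only perturbative pruning, scoring structured units by how much the loss rises when they are ablated and then dropping the lowest-scoring units, can be sketched on a toy model. Everything below (the toy "model", function names, and the random-submodel sampling scheme) is an illustrative assumption, not the actual Bonsai implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": the loss is driven by a handful of modules, each with a
# hidden importance. In a real LLM these units would be attention heads
# or MLP neurons, and the loss would be perplexity on calibration data.
true_importance = np.array([0.05, 0.9, 0.1, 0.7, 0.02, 0.4])
n_modules = len(true_importance)

def forward_loss(mask):
    """Forward pass with some modules masked out (mask[i] == 0 means
    module i is ablated). Loss grows when important modules are removed."""
    return float(np.sum(true_importance * (1 - mask)))

def estimate_importance(n_samples=1000, keep_prob=0.5):
    """Perturbative, gradient-free importance estimation: run forward
    passes on randomly sampled sub-models and credit each module with
    the average loss increase observed whenever it is absent."""
    scores = np.zeros(n_modules)
    counts = np.zeros(n_modules)
    base = forward_loss(np.ones(n_modules))  # loss of the full model
    for _ in range(n_samples):
        mask = (rng.random(n_modules) < keep_prob).astype(float)
        loss = forward_loss(mask)
        absent = mask == 0
        scores[absent] += loss - base
        counts[absent] += 1
    return scores / np.maximum(counts, 1)

def prune(sparsity=0.5):
    """Drop the fraction of modules with the lowest estimated importance."""
    importance = estimate_importance()
    k = int(sparsity * n_modules)
    drop = np.argsort(importance)[:k]
    mask = np.ones(n_modules)
    mask[drop] = 0.0
    return mask

mask = prune(0.5)
print(mask)  # zeroed entries mark the pruned (least important) modules
```

No backward pass or gradient storage is needed at any point: memory peaks at the cost of inference, which is what allows much larger models to be pruned on a single GPU.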

Steven Kolawole, Lucio Dery, Jean-François Kagy, Virginia Smith, Graham Neubig, Ameet Talwalkar • 2024

Related benchmarks

Task                              | Dataset             | Metric     | Result | Rank
Language Modeling                 | WikiText2           | Perplexity | 20.43  | 1875
Commonsense Reasoning             | HellaSwag           | Accuracy   | 29     | 1460
Multi-task Language Understanding | MMLU                | Accuracy   | 39.56  | 842
Commonsense Reasoning             | WinoGrande          | Accuracy   | 49     | 776
Question Answering                | ARC Challenge       | Accuracy   | 18     | 749
Commonsense Reasoning             | PIQA                | Accuracy   | 59     | 647
Language Modeling                 | WikiText            | Perplexity | 33.23  | 479
Question Answering                | ARC Easy            | Accuracy   | 47     | 386
Language Modeling                 | WikiText2 v1 (test) | Perplexity | 80.89  | 341
Language Modeling                 | WikiText2 (val)     | Perplexity | 80.89  | 277

Showing 10 of 19 rows
