
SQ-format: A Unified Sparse-Quantized Hardware-friendly Data Format for LLMs

About

Post-training quantization (PTQ) plays a crucial role in the democratization of large language models (LLMs). However, existing low-bit quantization and sparsification techniques struggle to balance accuracy and efficiency due to limited hardware support. For example, W4A8 can only achieve the same peak TOPS as W8A8, while the GPU-supported sparse data format (2:4 semi-structured sparsity) is seldom adopted because of its accuracy loss. To bridge this gap, in this paper we propose the Sparse-Quantized Format (SQ-format), a unified data format for quantization and sparsification that can be readily supported by new hardware and existing GPUs. SQ-format exploits the fact that sparse matrices can be accelerated at high precision, and that low-precision matrix multiplication can be accelerated accordingly. As such, SQ-format achieves a Pareto improvement between accuracy and throughput. The format is particularly suitable for activations with unevenly distributed outliers and makes their static compression possible. We show state-of-the-art PTQ performance with SQ-format, propose the hardware required to support it, and offer design exploration and insights for next-generation AI accelerators.
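For context on the 2:4 semi-structured format the abstract refers to: GPUs that support it require every contiguous group of four weights to contain at most two nonzeros. A minimal magnitude-based pruning sketch (a standard baseline, not the paper's SQ-format) is:

```python
import numpy as np

def prune_2_4(w: np.ndarray) -> np.ndarray:
    """Magnitude-based 2:4 semi-structured pruning: in every
    contiguous group of 4 weights, keep the 2 largest-magnitude
    entries and zero the other 2."""
    flat = w.reshape(-1, 4)
    # Indices of the 2 smallest-magnitude entries in each group of 4.
    drop = np.argsort(np.abs(flat), axis=1)[:, :2]
    out = flat.copy()
    np.put_along_axis(out, drop, 0.0, axis=1)
    return out.reshape(w.shape)

w = np.array([[0.9, -0.1, 0.05, -0.7],
              [0.2,  0.3, -0.4,  0.1]])
print(prune_2_4(w))
# Each row keeps its two largest-magnitude weights:
# [[ 0.9  0.   0.  -0.7]
#  [ 0.   0.3 -0.4  0. ]]
```

The resulting matrix has exactly 50% sparsity in a hardware-friendly pattern; the accuracy loss this one-shot pruning causes on LLM weights is the gap the paper's SQ-format is designed to close.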

Ruixuan Huang, Hao Zeng, Hantao Huang, Jinyuan Shi, Minghui Yu, Ian En-Hsu Yen, Shuai Wang • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Commonsense Reasoning | HellaSwag | Accuracy | 87.19 | 1891 |
| Question Answering | ARC Challenge | Accuracy | 63.99 | 906 |
| Language Modeling | WikiText | PPL | 3.31 | 732 |
| Physical Commonsense Reasoning | PIQA | Accuracy | 85.31 | 572 |
| Question Answering | ARC Easy | Normalized Acc | 85.52 | 389 |
| Commonsense Reasoning | WinoGrande | Accuracy | 79.45 | 372 |
| Mathematical Reasoning | GSM8K | Accuracy | 96.21 | 312 |
| Language Modeling | LAMBADA | Perplexity | 12.77 | 150 |
| Multiple-choice Question Answering | ARC Challenge | Non-generative Accuracy | 0.628 | 36 |
| General Evaluation | AGIEval | Accuracy | 69.58 | 29 |

(Showing 10 of 12 rows)
