FAAR: Format-Aware Adaptive Rounding for NVFP4
About
Deploying large language models (LLMs) on edge devices requires extremely low-bit quantization. Ultra-low-precision formats such as NVFP4 offer a promising way to reduce memory footprint and accelerate computation. However, existing quantization methods typically rely on conventional rounding strategies and fail to account for the non-uniformity of the NVFP4 numerical grid, resulting in suboptimal rounding decisions and amplified quantization errors. To address this, we propose Format-Aware Adaptive Rounding (FAAR), a learnable rounding strategy tailored to the NVFP4 format. Unlike conventional quantization paradigms, FAAR explicitly incorporates the non-uniform NVFP4 grid into the optimization process. By adaptively adjusting rounding decisions guided by loss gradients, our method effectively approximates the theoretically optimal quantization. To complement FAAR, we introduce a two-stage Format Alignment (2FA) fine-tuning scheme that aligns LLM parameters layer by layer to the NVFP4 numerical space, further narrowing the performance gap. Remarkably, this learnable optimization incurs a minimal training overhead of only 4 GPU hours on Llama3-1B. Extensive experiments demonstrate the effectiveness of our approach. Compared with Round-to-Nearest (RTN), our method reduces perplexity on WikiText-2 from 14.28 to 12.60 on Llama3-1B and from 23.06 to 21.27 on Qwen3-1.7B. Additionally, our method consistently outperforms state-of-the-art approaches across various zero-shot downstream tasks.
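To make the contrast concrete, the sketch below quantizes weights onto the non-uniform FP4 (E2M1) value grid used by NVFP4 elements, comparing plain round-to-nearest with an adaptive choice between the two nearest grid neighbours. This is a minimal illustration, not the paper's method: `alpha` is a hypothetical stand-in for FAAR's learnable per-weight rounding decision (which the paper optimizes from loss gradients), and NVFP4's per-block FP8 scaling is simplified to a single float scale.

```python
import numpy as np

# Positive half of the FP4 E2M1 grid; the full grid is symmetric around zero.
# Note the spacing is non-uniform (0.5 steps near zero, 1.0 and 2.0 steps above).
FP4_POS = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])
FULL_GRID = np.unique(np.concatenate([-FP4_POS, FP4_POS]))

def rtn_quantize(w, scale):
    """Round-to-nearest onto the non-uniform FP4 grid."""
    x = np.asarray(w, dtype=float) / scale
    idx = np.abs(x[:, None] - FULL_GRID[None, :]).argmin(axis=1)
    return FULL_GRID[idx] * scale

def adaptive_quantize(w, scale, alpha):
    """Pick between the grid neighbour below and above each weight.

    alpha in [0, 1] is a hypothetical stand-in for a learnable rounding
    decision; here it is simply thresholded at 0.5.
    """
    x = np.asarray(w, dtype=float) / scale
    hi = np.searchsorted(FULL_GRID, x).clip(1, len(FULL_GRID) - 1)
    lo = hi - 1
    q = np.where(alpha >= 0.5, FULL_GRID[hi], FULL_GRID[lo])
    return q * scale

w = np.array([0.55, -2.4, 3.7, 0.1])
print(rtn_quantize(w, 1.0))                                   # nearest grid points
print(adaptive_quantize(w, 1.0, np.array([1.0, 0.0, 1.0, 0.0])))  # directed rounding
```

Because the grid steps grow with magnitude, the nearest-neighbour choice that minimizes per-weight error is not always the one that minimizes the layer's output loss; making that choice learnable is the core idea FAAR exploits.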
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Commonsense Reasoning | HellaSwag | Accuracy | 78.36 | 1891 |
| Language Modeling | WikiText-2 | Perplexity (PPL) | 8.13 | 1624 |
| Question Answering | BoolQ | -- | -- | 317 |
| Multiple-choice Question Answering | ARC Easy | Accuracy | 79.02 | 188 |
| Multiple-choice Question Answering | ARC Challenge | Accuracy | 53.95 | 118 |
| Language Modeling | C4 | Word Perplexity | 20.17 | 32 |
| Feature Space Preservation | WikiText-2 | Cosine Similarity | 99.13 | 32 |
| Feature Space Preservation | C4 | Cosine Similarity | 98.71 | 32 |