
CompSRT: Quantization and Pruning for Image Super Resolution Transformers

About

Model compression has become an important tool for making image super resolution models more efficient. However, the gap between the best compressed models and the full-precision model remains large, and a deeper understanding of how compression behaves on performant models is still needed. Prior research on quantization of LLMs has shown that Hadamard transformations produce weights and activations with fewer outliers, which improves performance. We argue that while the Hadamard transform does reduce the effect of outliers, an empirical analysis of how the transform functions is still needed. By studying the distributions of weights and activations of SwinIR-light, we show through statistical analysis that the lower errors are caused by the Hadamard transform's ability to reduce the ranges and increase the proportion of values around $0$. Based on these findings, we introduce CompSRT, a more performant way to compress the image super resolution transformer network SwinIR-light. We perform Hadamard-based quantization, and we also perform scalar decomposition to introduce two additional trainable parameters. Our quantization performance surpasses the SOTA with statistically significant gains of up to 1.53 dB, and visibly improves visual quality by reducing blurriness at all bitwidths. At $3$-$4$ bits, to show that our method is compatible with pruning for increased compression, we also prune $40\%$ of weights and achieve a $6.67$-$15\%$ reduction in bits per parameter with performance comparable to SOTA.
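To make the abstract's core claim concrete, here is a minimal sketch (not the paper's implementation) of why rotating a weight vector with a Hadamard matrix helps uniform quantization: an outlier that would otherwise dominate the quantization scale is spread across all coordinates, shrinking the range the quantizer must cover. The toy vector, bitwidth, and helper names are illustrative assumptions.

```python
# Illustrative sketch, NOT CompSRT itself: a Hadamard rotation spreads an
# outlier's energy across coordinates, reducing the max-abs range that a
# symmetric uniform quantizer must cover.

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = [[1]]
    while len(H) < n:
        H = ([row + row for row in H] +
             [row + [-x for x in row] for row in H])
    return H

def quantize(v, bits):
    """Symmetric uniform quantization to `bits` bits, then dequantization."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(x) for x in v) / qmax
    return [round(x / scale) * scale for x in v]

# Toy weight vector with one outlier, as often seen in transformer layers.
w = [0.1, -0.2, 0.15, 3.0]

H = hadamard(4)
s = 4 ** 0.5  # (1/sqrt(n)) * H is orthonormal and its own inverse
hw = [sum(H[i][j] * w[j] for j in range(4)) / s for i in range(4)]

print(max(abs(x) for x in w))   # 3.0: range dominated by the outlier
print(max(abs(x) for x in hw))  # smaller: outlier energy is spread out

# Quantize in the rotated basis, then rotate back with the same matrix.
q = quantize(hw, 3)
back = [sum(H[i][j] * q[j] for j in range(4)) / s for i in range(4)]
err_direct = sum((a - b) ** 2 for a, b in zip(w, quantize(w, 3)))
err_hadamard = sum((a - b) ** 2 for a, b in zip(w, back))
print(err_hadamard < err_direct)  # prints True for this outlier-heavy vector
```

This mirrors the abstract's statistical finding: the rotated values have a smaller range and concentrate near zero, so the same bit budget yields lower reconstruction error.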

Dorsa Zeinali, Hailing Wang, Yitian Zhang, Yun Fu • 2026

Related benchmarks

Task                          | Dataset  | Metric | Result | Rank
Image Super-resolution        | Manga109 | PSNR   | 39.05  | 656
Image Super-resolution        | Set5     | PSNR   | 38.13  | 507
Single Image Super-Resolution | Urban100 | PSNR   | 32.57  | 500
Single Image Super-Resolution | Set5     | PSNR   | 38.15  | 352
Image Super-resolution        | Set14    | PSNR   | 33.9   | 329
Image Super-resolution        | Urban100 | PSNR   | 32.8   | 221
Image Super-resolution        | B100     | PSNR   | 32.28  | 51
Image Super-resolution        | B100     | PSNR   | 32.29  | 24
