
Efficient Attention-Sharing Information Distillation Transformer for Lightweight Single Image Super-Resolution

About

Transformer-based Super-Resolution (SR) methods have demonstrated superior performance compared to convolutional neural network (CNN)-based SR approaches due to their capability to capture long-range dependencies. However, their high computational complexity necessitates the development of lightweight approaches for practical use. To address this challenge, we propose the Attention-Sharing Information Distillation (ASID) network, a lightweight SR network that integrates attention-sharing and an information distillation structure specifically designed for Transformer-based SR methods. We modify the information distillation scheme, originally designed for efficient CNN operations, to reduce the computational load of stacked self-attention layers, effectively addressing the efficiency bottleneck. Additionally, we introduce attention-sharing across blocks to further minimize the computational cost of self-attention operations. By combining these strategies, ASID achieves competitive performance with existing SR methods while requiring only around 300K parameters, significantly fewer than existing CNN-based and Transformer-based SR models. Furthermore, ASID outperforms state-of-the-art SR methods when the number of parameters is matched, demonstrating its efficiency and effectiveness. The code and supplementary material are available on the project page.
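The attention-sharing idea described above lends itself to a short illustration. The sketch below is a minimal PyTorch rendering of the general technique, not the authors' implementation: the first block in a group computes the self-attention map, and subsequent blocks reuse it, paying only for value and output projections. All names (SharedAttentionBlock, AttentionSharingGroup, n_blocks) and the residual wiring are assumptions for illustration; ASID's actual information-distillation structure is omitted.

```python
# Illustrative sketch of attention-sharing across Transformer blocks.
# Assumption: module/parameter names are hypothetical, not from ASID's code.
import torch
import torch.nn as nn

class SharedAttentionBlock(nn.Module):
    def __init__(self, dim, num_heads, compute_attn=True):
        super().__init__()
        self.num_heads = num_heads
        self.compute_attn = compute_attn
        # Blocks that reuse a shared attention map skip the Q/K projections
        # entirely, which is where the parameter and FLOP savings come from.
        self.qk = nn.Linear(dim, dim * 2, bias=False) if compute_attn else None
        self.v = nn.Linear(dim, dim, bias=False)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x, shared_attn=None):
        B, N, C = x.shape
        H = self.num_heads
        if self.compute_attn:
            # Compute the attention map once: (B, H, N, N).
            q, k = self.qk(x).reshape(B, N, 2, H, C // H).permute(2, 0, 3, 1, 4)
            attn = (q @ k.transpose(-2, -1)) * (C // H) ** -0.5
            attn = attn.softmax(dim=-1)
        else:
            # Reuse the map produced by an earlier block in the group.
            attn = shared_attn
        v = self.v(x).reshape(B, N, H, C // H).transpose(1, 2)
        out = (attn @ v).transpose(1, 2).reshape(B, N, C)
        return self.proj(out), attn

class AttentionSharingGroup(nn.Module):
    """First block computes the attention map; the remaining blocks share it."""
    def __init__(self, dim=64, num_heads=4, n_blocks=3):
        super().__init__()
        self.blocks = nn.ModuleList(
            [SharedAttentionBlock(dim, num_heads, compute_attn=(i == 0))
             for i in range(n_blocks)]
        )

    def forward(self, x):
        attn = None
        for blk in self.blocks:
            out, attn = blk(x, shared_attn=attn)
            x = x + out  # residual connection (an assumed design choice)
        return x

# Usage: a batch of 4096 tokens (e.g. a 64x64 feature map) with 64 channels.
x = torch.randn(1, 64 * 64, 64)
y = AttentionSharingGroup()(x)
```

In this sketch only the first block carries Q/K weights, so each additional block in the group adds roughly half the attention parameters of a standard block and avoids the quadratic QK softmax entirely.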

Karam Park, Jae Woong Soh, Nam Ik Cho • 2025

Related benchmarks

Task                    | Dataset             | PSNR (dB) | Rank
Image Super-resolution  | Urban100 x4 (test)  | 26.89     | 282
Super-Resolution        | Urban100 x2         | 33.46     | 104
Super-Resolution        | Urban100 x4         | 27.07     | 103
Super-Resolution        | Manga109 4x         | 31.54     | 99
Super-Resolution        | Urban100 x3         | 29.28     | 91
Image Super-resolution  | Urban100 x2 (test)  | 33.35     | 91
Image Super-resolution  | Urban100 x3 (test)  | 29.08     | 72
Super-Resolution        | Manga109 2x         | 39.54     | 71
Super-Resolution        | Set14 2x            | 34.24     | 63
Image Super-resolution  | B100 x4 (test)      | 27.78     | 59

Showing 10 of 27 rows.
