Rank-Factorized Implicit Neural Bias: Scaling Super-Resolution Transformer with FlashAttention

About

Recent Super-Resolution (SR) methods mainly adopt Transformers for their strong long-range modeling capability and exceptional representational capacity. However, most SR Transformers rely heavily on relative positional bias (RPB), which prevents them from leveraging hardware-efficient attention kernels such as FlashAttention. This limitation imposes a prohibitive computational burden during both training and inference, severely restricting attempts to scale SR Transformers by enlarging the training patch size or the self-attention window. Consequently, unlike other domains that actively exploit the inherent scalability of Transformers, SR Transformers remain heavily focused on effectively utilizing limited receptive fields. In this paper, we propose Rank-factorized Implicit Neural Bias (RIB), an alternative to RPB that enables FlashAttention in SR Transformers. Specifically, RIB approximates the positional bias using low-rank implicit neural representations and concatenates them with pixel content tokens in a channel-wise manner, turning the element-wise bias addition in the attention score computation into a dot-product operation. Further, we introduce a convolutional local attention and a cyclic window strategy to fully leverage the long-range interactions enabled by RIB and FlashAttention. We enlarge the window size up to 96×96 while jointly scaling the training patch size and the dataset size, maximizing the benefits of Transformers in the SR task. As a result, our network achieves 35.63 dB PSNR on Urban100 ×2, while reducing training and inference time by 2.1× and 2.9×, respectively, compared to the RPB-based SR Transformer (PFT).
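The concatenation trick at the heart of RIB can be sketched in a few lines: if the positional bias is factorized as B ≈ UVᵀ with rank-r factors U and V produced by a small coordinate MLP (the implicit neural representation), then appending U to the queries and V to the keys gives [Q, U][K, V]ᵀ = QKᵀ + UVᵀ, so the bias rides along inside the dot product and no explicit bias tensor has to be materialized. The PyTorch sketch below illustrates this under stated assumptions: the rank, MLP architecture, and coordinate encoding are illustrative choices, not the paper's configuration, and the fused kernel is reached through torch.nn.functional.scaled_dot_product_attention.

```python
# Minimal sketch of the RIB idea (illustrative; not the paper's exact design).
# Assumptions: rank r, a 2-layer coordinate MLP as the implicit neural
# representation, and per-window attention over N = window * window tokens.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RIBAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int, window: int, rank: int = 8):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)
        # Implicit neural representation: maps each token's (x, y) position
        # inside the window to per-head rank-r factors U and V.
        self.inr = nn.Sequential(
            nn.Linear(2, 64), nn.GELU(),
            nn.Linear(64, 2 * num_heads * rank),
        )
        ys, xs = torch.meshgrid(torch.arange(window), torch.arange(window),
                                indexing="ij")
        coords = torch.stack([xs, ys], -1).float().view(-1, 2) / (window - 1)
        self.register_buffer("coords", coords)  # (N, 2), normalized to [0, 1]

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, N, C = x.shape  # one window per batch element, N = window ** 2
        q, k, v = self.qkv(x).chunk(3, dim=-1)

        def to_heads(t):  # (B, N, C) -> (B, H, N, d)
            return t.view(B, N, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = to_heads(q), to_heads(k), to_heads(v)

        # Low-rank bias factors from the INR: bias ~= u_pos @ v_pos^T.
        uv = self.inr(self.coords).view(N, 2, self.num_heads, -1)
        u_pos = uv[:, 0].permute(1, 0, 2).unsqueeze(0).expand(B, -1, -1, -1)
        v_pos = uv[:, 1].permute(1, 0, 2).unsqueeze(0).expand(B, -1, -1, -1)

        # Channel-wise concatenation turns "+ bias" into part of the dot
        # product: [q, u][k, v]^T = q k^T + u v^T. Pre-scale q so that only
        # the content term carries 1/sqrt(d), then disable SDPA's own scaling.
        q = torch.cat([q * self.head_dim ** -0.5, u_pos], dim=-1)
        k = torch.cat([k, v_pos], dim=-1)
        out = F.scaled_dot_product_attention(q, k, v, scale=1.0)
        return self.proj(out.transpose(1, 2).reshape(B, N, C))
```

Note that a fused attention backend imposes its own constraints (supported dtypes, head dimensions), so SDPA may fall back to its math path in some configurations; the algebra, and hence the output, is identical either way.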

Dongheon Lee, Seokju Yun, Jaegyun Im, Youngmin Ro • 2026

Related benchmarks

Task                            Dataset              Result        Rank
Super-Resolution                Set5                 PSNR 38.78    785
Super-Resolution                Set14                PSNR 35.28    613
Image Super-resolution          Set5 (test)          PSNR 38.42    566
Super-Resolution                B100 (test)          PSNR 32.53    381
Super-Resolution                Manga109             PSNR 40.86    330
Super-Resolution                BSD100               PSNR 32.73    329
Image Super-resolution          Set14 (test)         PSNR 34.62    314
Single Image Super-Resolution   Urban100 (test)      PSNR 34.31    311
Image Super-resolution          Urban100 x4 (test)   PSNR 28.39    282
Image Super-resolution          Manga109 (test)      PSNR 39.9     255

Showing 10 of 28 rows.
