
Accurate Image Restoration with Attention Retractable Transformer

About

Recently, Transformer-based image restoration networks have achieved promising improvements over convolutional neural networks due to parameter-independent global interactions. To lower computational cost, existing works generally limit self-attention computation to non-overlapping windows. However, each group of tokens is always drawn from a dense area of the image. This is considered a dense attention strategy, since token interactions are restrained to dense regions, which clearly restricts the receptive field. To address this issue, we propose the Attention Retractable Transformer (ART) for image restoration, which contains both dense and sparse attention modules. The sparse attention module allows tokens from sparse areas to interact and thus provides a wider receptive field. Furthermore, alternating dense and sparse attention modules greatly enhances the representation ability of the Transformer while providing retractable attention on the input image. We conduct extensive experiments on image super-resolution, denoising, and JPEG compression artifact reduction. Experimental results validate that the proposed ART outperforms state-of-the-art methods on various benchmark datasets, both quantitatively and visually. Code and models are available at https://github.com/gladzhang/ART.
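The dense-vs-sparse token grouping described in the abstract can be sketched in a few lines. This is a minimal illustration under my own assumptions (function names and NumPy index arithmetic are not taken from the authors' code, which is at the repository linked above): dense attention groups w x w blocks of adjacent tokens, while sparse attention groups tokens a fixed interval apart, so each group spans the whole image.

```python
import numpy as np

def dense_groups(x, w):
    # Dense attention: partition the H x W token grid into
    # non-overlapping w x w windows of *adjacent* tokens.
    H, W = x.shape
    return (x.reshape(H // w, w, W // w, w)
             .transpose(0, 2, 1, 3)
             .reshape(-1, w * w))

def sparse_groups(x, interval):
    # Sparse attention: tokens a fixed `interval` apart along each
    # axis form one group, so every group samples the entire image,
    # widening the receptive field of each attention computation.
    H, W = x.shape
    return (x.reshape(H // interval, interval, W // interval, interval)
             .transpose(1, 3, 0, 2)
             .reshape(interval * interval, -1))

# 8x8 grid of token ids. Window size 4 and interval 2 give groups of
# the same size (16 tokens), so the two modules can alternate on the
# same grid, as the paper describes.
tokens = np.arange(64).reshape(8, 8)
print(dense_groups(tokens, 4)[0])   # contiguous top-left window
print(sparse_groups(tokens, 2)[0])  # strided tokens across the image
```

Note how the first dense group contains only tokens from the top-left corner, while the first sparse group reaches every region of the grid; alternating the two is what makes the attention "retractable".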

Jiale Zhang, Yulun Zhang, Jinjin Gu, Yongbing Zhang, Linghe Kong, Xin Yuan • 2022

Related benchmarks

| Task                   | Dataset            | Result          | Rank |
|------------------------|--------------------|-----------------|------|
| Super-Resolution       | Set5               | PSNR 38.56      | 751  |
| Image Super-resolution | Manga109           | PSNR 40.24      | 656  |
| Super-Resolution       | Urban100           | PSNR 34.3       | 603  |
| Super-Resolution       | Set14              | PSNR 34.59      | 586  |
| Super-Resolution       | BSD100             | PSNR 32.58      | 313  |
| Super-Resolution       | Set14 4x (test)    | PSNR 29.16      | 117  |
| Super-Resolution       | Set5 x2 (test)     | PSNR 38.56      | 95   |
| Image Super-resolution | Urban100 x4 (test) | PSNR 27.77      | 90   |
| Super-Resolution       | Manga109 4x        | PSNR 32.31      | 88   |
| Super-Resolution       | Set5 3 (test)      | PSNR (dB) 35.07 | 87   |
Showing 10 of 44 rows
