
Attention Cube Network for Image Restoration

About

Recently, deep convolutional neural networks (CNNs) have been widely used in image restoration and have achieved great success. However, most existing methods are limited by local receptive fields and treat different types of information equally. Moreover, existing methods typically aggregate different feature maps with a multi-supervised scheme, which cannot effectively aggregate hierarchical feature information. To address these issues, we propose an attention cube network (A-CubeNet) for image restoration, enabling more powerful feature expression and feature-correlation learning. Specifically, we design a novel attention mechanism along three dimensions: the spatial dimension, the channel dimension, and the hierarchical dimension. The adaptive spatial attention branch (ASAB) and the adaptive channel attention branch (ACAB) together form the adaptive dual attention module (ADAM), which captures long-range spatial and channel-wise contextual information to expand the receptive field and distinguish different types of information, yielding more effective feature representations. Furthermore, the adaptive hierarchical attention module (AHAM) captures long-range hierarchical contextual information to flexibly aggregate different feature maps with weights that depend on the global context. ADAM and AHAM cooperate to form an "attention in attention" structure, meaning that AHAM's inputs are enhanced by ASAB and ACAB. Experiments demonstrate the superiority of our method over state-of-the-art image restoration methods in both quantitative comparisons and visual analysis. Code is available at https://github.com/YCHang686/A-CubeNet.
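The AHAM described above weights each hierarchical feature map by a coefficient derived from the global context and sums the results. A minimal NumPy sketch of that idea is below; the global-average-pool descriptor, the shared scoring vector `w`, and the softmax normalization are simplifying assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def hierarchical_attention(feature_maps, w):
    """Toy AHAM-style aggregation: fuse K hierarchical feature maps
    (each of shape C x H x W) with weights computed from global context.
    `w` stands in for the learned scoring parameters (an assumption)."""
    # Global average pool each map to a channel descriptor: shape (K, C)
    descriptors = np.stack([f.mean(axis=(1, 2)) for f in feature_maps])
    # Score each map, then normalize so the weights sum to 1
    weights = softmax(descriptors @ w)          # shape (K,)
    # Weighted sum fuses the hierarchy into one feature map
    fused = sum(a * f for a, f in zip(weights, feature_maps))
    return fused, weights

# Toy usage: 3 hierarchical maps, 4 channels, 8x8 spatial size
rng = np.random.default_rng(0)
maps = [rng.standard_normal((4, 8, 8)) for _ in range(3)]
w = rng.standard_normal(4)
fused, weights = hierarchical_attention(maps, w)
```

Because the weights are context-dependent rather than fixed, the aggregation can emphasize different hierarchy levels for different inputs, which is the behavior the abstract attributes to AHAM.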

Yucheng Hang, Qingmin Liao, Wenming Yang, Yupeng Chen, Jie Zhou • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Super-resolution | Urban100 x4 (test) | PSNR | 26.27 | 90 |
| Image Super-resolution | Urban100 x2 (test) | PSNR | 32.39 | 72 |
| Image Super-resolution | Urban100 x3 (test) | PSNR | 28.38 | 58 |
| Image Super-resolution | Manga109 x2 (test) | PSNR | 38.88 | 52 |
| Image Super-resolution | Manga109 x3 (test) | PSNR | 33.9 | 49 |
| Image Super-resolution | B100 x4 (test) | PSNR | 27.65 | 45 |
| Image Super-resolution | B100 x2 (test) | PSNR | 32.26 | 39 |
| Image Super-resolution | Set5 x3 scale (test) | PSNR | 34.53 | 32 |
| Image Super-resolution | Set14 x2 scale (test) | PSNR | 33.73 | 32 |
| Image Super-resolution | B100 x3 (test) | PSNR | 29.17 | 29 |
(Showing 10 of 15 rows.)
