
Attention on Attention for Image Captioning

About

Attention mechanisms are widely used in current encoder/decoder frameworks for image captioning, where a weighted average over encoded vectors is generated at each time step to guide the caption decoding process. However, the decoder has little idea of whether, or how well, the attended vector and the given attention query are related, which could lead the decoder to produce misleading results. In this paper, we propose an Attention on Attention (AoA) module, which extends conventional attention mechanisms to determine the relevance between attention results and queries. AoA first generates an information vector and an attention gate using the attention result and the current context; it then adds a second attention by applying element-wise multiplication to them, finally obtaining the attended information, i.e., the expected useful knowledge. We apply AoA to both the encoder and the decoder of our image captioning model, which we name AoA Network (AoANet). Experiments show that AoANet outperforms all previously published methods and achieves a new state-of-the-art performance of 129.8 CIDEr-D score on the MS COCO Karpathy offline test split and 129.6 CIDEr-D (C40) score on the official online testing server. Code is available at https://github.com/husthuaan/AoANet.
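The gating step described above can be sketched in a few lines. The following is a minimal pure-Python illustration, not the paper's released code: the weight names (`Wv_i`, `Wq_i`, etc.) and helper functions are assumptions chosen to mirror the abstract's description of an information vector and a sigmoid attention gate combined element-wise.

```python
import math

def sigmoid(x):
    """Logistic sigmoid, used to squash gate activations into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def linear(W, x):
    """Matrix-vector product W @ x, with W given as a list of rows."""
    return [sum(w * xj for w, xj in zip(row, x)) for row in W]

def aoa(v_hat, q, Wv_i, Wq_i, b_i, Wv_g, Wq_g, b_g):
    """Attention on Attention (sketch).

    Given an attention result v_hat and a query q:
      info = Wv_i @ v_hat + Wq_i @ q + b_i          (information vector)
      gate = sigmoid(Wv_g @ v_hat + Wq_g @ q + b_g) (attention gate)
      out  = gate * info                            (element-wise product)
    The gate lets the model suppress attended content that is
    poorly related to the query.
    """
    info = [a + b + c for a, b, c in
            zip(linear(Wv_i, v_hat), linear(Wq_i, q), b_i)]
    gate = [sigmoid(a + b + c) for a, b, c in
            zip(linear(Wv_g, v_hat), linear(Wq_g, q), b_g)]
    return [g * i for g, i in zip(gate, info)]

# Tiny 2-D example with identity weights and zero biases:
I2 = [[1.0, 0.0], [0.0, 1.0]]
zeros = [0.0, 0.0]
out = aoa([1.0, 2.0], [0.0, 0.0], I2, I2, zeros, I2, I2, zeros)
```

With identity weights and a zero query, the output is simply `sigmoid(v_hat) * v_hat` component-wise, making it easy to see the gate modulating the information vector. In AoANet the same module is applied after both the encoder's self-attention and the decoder's attention over image features.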

Lun Huang, Wenmin Wang, Jie Chen, Xiao-Yong Wei • 2019

Related benchmarks

Task               Dataset                           Metric        Result   Rank
Image Captioning   MS COCO Karpathy (test)           CIDEr         1.32     682
Image Captioning   MS-COCO (test)                    CIDEr         129.8    117
Image Captioning   MS COCO (Karpathy)                CIDEr-D       129.8    56
Image Captioning   TextCaps (val)                    CIDEr         42.7     51
Image Captioning   TextCaps (test)                   CIDEr         34.6     50
Image Captioning   MS-COCO online (test)             BLEU-4 (c5)   39.4     49
Image Captioning   MS-COCO 2014 (test)               BLEU-4        71.2     43
Image Captioning   Conceptual Captions (test)        CIDEr         39       34
Image Captioning   MS-COCO Karpathy 2014 (test)      BLEU-4        38.9     24
Image Captioning   COCO c5 references online (test)  BLEU-1        81       24

Showing 10 of 16 rows.
