
Graph Attention Transformer Network for Multi-Label Image Classification

About

Multi-label classification aims to recognize multiple objects or attributes in an image. However, learning a proper label graph that effectively characterizes inter-label correlations or dependencies is challenging. Current methods often model this correlation with an adjacency matrix built from label co-occurrence probabilities in the training set, which ties the graph to a particular dataset and limits the model's generalization ability. In this paper, we propose the Graph Attention Transformer Network (GATN), a general framework for multi-label image classification that can effectively mine complex inter-label relationships. First, we use the cosine similarity between label word embeddings as the initial correlation matrix, which captures rich semantic information. We then design a graph attention transformer layer that transforms this adjacency matrix to adapt it to the current domain. Extensive experiments demonstrate that the proposed method achieves state-of-the-art performance on three datasets.
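The initialization step described above can be sketched as follows. The label set and the random vectors standing in for pretrained word embeddings are placeholders for illustration, not the authors' actual data or embedding model:

```python
import numpy as np

# Hypothetical 300-d word embeddings (e.g. GloVe-style) for four labels;
# random vectors here stand in for real pretrained embeddings.
rng = np.random.default_rng(0)
labels = ["person", "dog", "frisbee", "car"]
embeddings = rng.standard_normal((len(labels), 300))

# Pairwise cosine similarity between label embeddings gives the initial
# correlation (adjacency) matrix used before the graph attention
# transformer layer adapts it to the target domain.
unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
A = unit @ unit.T  # shape: (num_labels, num_labels), symmetric

print(A.shape)
```

Because the matrix is derived from semantic embeddings rather than training-set co-occurrence counts, it is independent of any particular dataset's label statistics.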

Jin Yuan, Shikai Chen, Yao Zhang, Zhongchao Shi, Xin Geng, Jianping Fan, Yong Rui • 2022

Related benchmarks

Task                            Dataset               Result      Rank
Multi-label image recognition   MS-COCO 2014 (val)    mAP 89.3    51
Multi-label classification      NUS-WIDE              mAP 59.8    21
Image classification            PASCAL VOC 2007 (val) mAP 96.3    13
