
Re-Attention Transformer for Weakly Supervised Object Localization

About

Weakly supervised object localization is a challenging task that aims to localize objects using only coarse annotations such as image categories. Existing deep network approaches are mainly based on class activation maps (CAM), which highlight discriminative local regions while ignoring the full extent of the object. In addition, emerging transformer-based techniques tend to place heavy emphasis on the background, which impedes their ability to identify complete objects. To address these issues, we present a re-attention mechanism, termed token refinement transformer (TRT), that captures object-level semantics to guide localization. Specifically, TRT introduces a novel module named the token priority scoring module (TPSM) to suppress the effects of background noise while focusing on the target object. We then incorporate the class activation map as a semantically aware input to restrain the attention map to the target object. Extensive experiments on two benchmarks demonstrate the superiority of our proposed method over existing methods using only image category annotations. Source code is available at https://github.com/su-hui-zz/ReAttentionTransformer.
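The core idea described above, modulating a transformer attention map with a class activation map so that background tokens are suppressed and only high-priority tokens survive, can be sketched as follows. This is a minimal illustration using numpy, not the authors' implementation; the function name `re_attention`, the normalization scheme, and the top-k token-keeping step are assumptions loosely inspired by the abstract's description of TPSM.

```python
import numpy as np

def re_attention(attn_map, cam, top_k=None):
    """Hypothetical sketch: restrain a patch-level attention map to the
    object by modulating it with a class activation map (CAM).

    attn_map, cam: 2D arrays over the same patch grid (e.g. 14x14 for a ViT).
    top_k: if given, keep only the top-k highest-scoring tokens (a rough
           stand-in for token priority scoring) and zero out the rest.
    """
    def norm(x):
        # Min-max normalize to [0, 1] so the two maps are comparable.
        x = x - x.min()
        rng = x.max()
        return x / rng if rng > 0 else x

    attn = norm(attn_map)
    cam = norm(cam)

    # Element-wise modulation: tokens with low CAM response (background)
    # are suppressed, pulling attention toward the target object.
    refined = attn * cam

    if top_k is not None:
        # Keep only the top-k scoring tokens; zero out the remainder.
        flat = refined.ravel()
        thresh = np.partition(flat, -top_k)[-top_k]
        refined = np.where(refined >= thresh, refined, 0.0)

    return norm(refined)

# Usage on synthetic maps (a 14x14 patch grid, as in a 224px ViT-B/16):
rng = np.random.default_rng(0)
attn = rng.random((14, 14))
cam = rng.random((14, 14))
out = re_attention(attn, cam, top_k=50)
```

In practice the attention map would come from the transformer's class-token attention and the CAM from a classification head; the multiplication simply acts as a semantic gate on the attention scores.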

Hui Su, Yue Ye, Zhiwei Chen, Mingli Song, Lechao Cheng · 2022

Related benchmarks

Task: Object Localization
Dataset: CUB
Result: Max Box Acc V2 = 82
Rank: 20
