
Attend and Rectify: a Gated Attention Mechanism for Fine-Grained Recovery

About

We propose a novel attention mechanism that enhances convolutional neural networks for fine-grained recognition. It learns to attend to lower-level feature activations without requiring part annotations and uses those activations to update and rectify the output likelihood distribution. In contrast to other approaches, the proposed mechanism is modular, architecture-independent, and efficient in both parameters and computation. Experiments show that networks augmented with our approach systematically improve their classification accuracy and become more robust to clutter. As a result, Wide Residual Networks augmented with our proposal surpass the state-of-the-art classification accuracies on CIFAR-10, the Adience gender recognition task, Stanford Dogs, and UEC Food-100.
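The idea above — attend to lower-level feature activations, predict classes from the attended descriptor, and gate that prediction into the network's output — can be sketched as follows. This is a minimal single-head NumPy sketch under simplifying assumptions, not the authors' implementation: the names `W_att`, `W_cls`, and `gate` are hypothetical stand-ins for learned parameters, and the paper's multi-head attention modules and learned gating over heads are collapsed into one head and one scalar gate.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_rectify(feat, logits, W_att, W_cls, gate):
    """Single-head sketch of attention-based rectification.

    feat   : (C, H, W) lower-level feature map from the backbone
    logits : (K,) original network output (pre-softmax)
    W_att  : (C,) hypothetical attention projection (a 1x1-conv analogue)
    W_cls  : (C, K) hypothetical per-descriptor class predictor
    gate   : scalar blending weight (learned in the real model)
    """
    C, H, W = feat.shape
    flat = feat.reshape(C, H * W)            # flatten spatial dims: (C, HW)
    att = softmax(W_att @ flat)              # spatial attention over HW locations
    pooled = flat @ att                      # (C,) attention-weighted descriptor
    att_logits = pooled @ W_cls              # (K,) attention-module prediction
    # Gated rectification: blend the original logits with the
    # attention module's prediction before normalizing.
    return softmax(logits + gate * att_logits)
```

Because the module only reads an existing feature map and adds a correction to the logits, it can be attached to any backbone layer, which is what makes the mechanism modular and architecture-independent.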

Pau Rodríguez, Josep M. Gonfaus, Guillem Cucurull, F. Xavier Roca, Jordi Gonzàlez • 2018

Related benchmarks

Task                               Dataset                Result          Rank
------------------------------------------------------------------------------
Image Classification               CIFAR-100 (val)        --              661
Image Classification               CIFAR-100              --              622
Fine-grained Image Classification  CUB200 2011 (test)     Accuracy: 85.6  536
Image Classification               CIFAR-10               --              471
Fine-grained Image Classification  Stanford Cars (test)   Accuracy: 90    348
Image Classification               CIFAR-10 (val)         --              329
Fine-grained Image Classification  Stanford Dogs (test)   Accuracy: 92.9  117
Fine-grained Image Classification  UEC Food-100 (test)    Accuracy: 85.5  3
Fine-grained Image Classification  Adience Gender (test)  Accuracy: 94.6  3
Fine-grained Image Classification  Adience Age (test)     Accuracy: 59.7  3

Other info

Code
