
Reduction of Class Activation Uncertainty with Background Information

About

Multitask learning is a popular approach to training high-performing neural networks with improved generalization. In this paper, we propose a background class to achieve improved generalization with lower computation than multitask learning, helping researchers and organizations with limited computational resources. We also present a methodology for selecting background images and discuss potential future improvements. We apply our approach to several datasets and achieve improved generalization at much lower computational cost. Through the class activation mappings (CAMs) of the trained models, we observe that models trained with the proposed methodology tend to look at the bigger picture. Applying a vision transformer with the proposed background class, we achieve state-of-the-art (SOTA) performance on the CIFAR-10C, Caltech-101, and CINIC-10 datasets. Example scripts are available in the `CAM' folder of the following GitHub repository: github.com/dipuk0506/UQ
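The core idea described above can be sketched as follows: background images are appended to the training set under one extra class index, so a 10-class problem becomes an 11-class problem during training, and the background logit is dropped at inference. This is a minimal numpy sketch of that idea under stated assumptions; the function names and data layout are illustrative and not taken from the paper's repository.

```python
import numpy as np

def add_background_class(images, labels, bg_images, num_classes):
    """Append background images under a new class index equal to num_classes.

    Assumes images and bg_images share the same per-sample shape and that
    labels are integer class indices in [0, num_classes).
    Returns the merged data and the new class count (num_classes + 1).
    """
    bg_labels = np.full(len(bg_images), num_classes, dtype=labels.dtype)
    all_images = np.concatenate([images, bg_images], axis=0)
    all_labels = np.concatenate([labels, bg_labels], axis=0)
    return all_images, all_labels, num_classes + 1

def predict_foreground(logits):
    """At test time, ignore the last (background) logit and argmax the rest."""
    return np.argmax(logits[:, :-1], axis=1)
```

A classifier head trained this way simply has one extra output unit; standard cross-entropy training is unchanged, which is why the overhead stays far below that of a second task head in multitask learning.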

H M Dipu Kabir• 2023

Related benchmarks

Task                 | Dataset                    | Metric         | Result | Rank
---------------------|----------------------------|----------------|--------|-----
Image Classification | CIFAR-10 (test)            | Accuracy       | 98.98  | 3381
Image Classification | STL-10 (test)              | Accuracy       | 99.71  | 357
Image Classification | CINIC-10 (test)            | Accuracy       | 95.8   | 177
Image Classification | Oxford Flowers-102 (test)  | Top-1 Accuracy | 99.75  | 131
Image Classification | Caltech101 (test)          | Accuracy       | 98.02  | 121
Image Classification | CIFAR-10C                  | Brightness Acc | 99.03  | 11
