
EXACT: How to Train Your Accuracy

About

Classification tasks are usually evaluated in terms of accuracy. However, accuracy is discontinuous and cannot be directly optimized using gradient ascent. Popular methods minimize cross-entropy, hinge loss, or other surrogate losses, which can lead to suboptimal results. In this paper, we propose a new optimization framework by introducing stochasticity to a model's output and optimizing expected accuracy, i.e., the accuracy of the stochastic model. Extensive experiments on linear models and deep image classification show that the proposed optimization method is a powerful alternative to widely used classification losses.
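To make the core idea concrete, here is a minimal sketch for the binary linear case (not the authors' code; function names and hyperparameters are our own). If the score w·x is perturbed by Gaussian noise with scale sigma, the probability that the noisy score has the correct sign is Phi(y·(w·x)/sigma), where Phi is the standard normal CDF. Averaging this over the data gives a smooth expected accuracy that can be maximized directly by gradient ascent:

```python
# Sketch: maximizing expected accuracy of a noisy binary linear classifier.
# Labels y are in {-1, +1}; the noisy score is w·x + eps, eps ~ N(0, sigma^2).
# P(correct) = Phi(y * w·x / sigma), which is smooth in w.
import numpy as np
from math import erf, sqrt, pi, exp

def phi_cdf(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def phi_pdf(z):
    """Standard normal density."""
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def expected_accuracy(w, X, y, sigma):
    """Mean probability that the noisy score has the correct sign."""
    margins = y * (X @ w) / sigma
    return float(np.mean([phi_cdf(m) for m in margins]))

def grad_expected_accuracy(w, X, y, sigma):
    """Gradient of expected accuracy w.r.t. w (chain rule through Phi)."""
    margins = y * (X @ w) / sigma
    dens = np.array([phi_pdf(m) for m in margins])
    return (X * (dens * y / sigma)[:, None]).mean(axis=0)

# Toy, mostly separable data: two Gaussian clusters.
rng = np.random.default_rng(0)
X_pos = rng.normal(size=(200, 2)) + np.array([2.0, 2.0])
X_neg = rng.normal(size=(200, 2)) - np.array([2.0, 2.0])
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(200), -np.ones(200)])

w = rng.normal(size=2)
for _ in range(500):  # gradient ASCENT on expected accuracy
    w += 1.0 * grad_expected_accuracy(w, X, y, sigma=1.0)

hard_acc = float(np.mean(np.sign(X @ w) == y))
```

Note that the paper's method covers the general multiclass and deep-network settings; this binary closed form is only the simplest illustration of replacing a discontinuous 0/1 objective with a differentiable expectation.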

Ivan Karpukhin, Stanislav Dereka, Sergey Kolesnikov • 2022

Related benchmarks

Task | Dataset | Metric | Result | Rank
Image Classification | MNIST (test) | Accuracy | 99.67 | 882
Image Classification | SVHN (test) | Accuracy | 97.79 | 51
Classification | dry-bean (test) | Accuracy | 92.99 | 39
Classification | WINE (test) | Accuracy | 100 | 29
Classification | Adult (test) | Min Test Accuracy | 85.21 | 24
Classification | cylinder-bands (test) | Accuracy | 77.22 | 13
Classification | Protein (test) | Accuracy | 97.69 | 11
Classification | breast-cancer-wisconsin (test) | Accuracy | 97.37 | 4
Classification | Car (test) | Accuracy | 95.2 | 4
Classification | audit-risk (test) | Accuracy | 96.67 | 4
Showing 10 of 12 rows
