
Complementary-Label Learning for Arbitrary Losses and Models

About

In contrast to the standard classification paradigm where the true class is given to each training pattern, complementary-label learning only uses training patterns each equipped with a complementary label, which only specifies one of the classes that the pattern does not belong to. The goal of this paper is to derive a novel framework of complementary-label learning with an unbiased estimator of the classification risk, for arbitrary losses and models---all existing methods have failed to achieve this goal. Not only is this beneficial for the learning stage, it also makes model/hyper-parameter selection (through cross-validation) possible without the need for any ordinarily labeled validation data, while using any linear/non-linear models or convex/non-convex loss functions. We further improve the risk estimator by a non-negative correction and a gradient ascent trick, and demonstrate its superiority through experiments.

Takashi Ishida, Gang Niu, Aditya Krishna Menon, Masashi Sugiyama • 2018
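The core idea in the abstract (rewriting the classification risk in terms of complementary labels so any loss and model can be trained on them) can be illustrated with a small sketch. Assuming complementary labels are drawn uniformly from the K-1 incorrect classes, an unbiased rewriting of the risk takes the form of a weighted combination of the per-class losses; the helper names below (`softmax_cross_entropy`, `complementary_risk`) are hypothetical and this is a simplified illustration, not the paper's exact implementation.

```python
import numpy as np

def softmax_cross_entropy(logits, k):
    """Per-sample softmax cross-entropy loss against class k. logits: (n, K)."""
    shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_z = np.log(np.exp(shifted).sum(axis=1))
    return log_z - shifted[:, k]

def complementary_risk(logits, comp_labels, num_classes):
    """Empirical risk estimate from complementary labels only.

    Under the uniform complementary-label assumption, the ordinary risk can be
    rewritten as E[ sum_j loss(f(x), j) - (K - 1) * loss(f(x), comp_label) ],
    which involves no ordinary labels. Individual terms can be negative, which
    motivates the paper's non-negative correction during training.
    """
    n = logits.shape[0]
    # (n, K) matrix of losses against every class
    losses = np.stack(
        [softmax_cross_entropy(logits, k) for k in range(num_classes)], axis=1
    )
    total = losses.sum(axis=1)                     # sum over all classes
    comp = losses[np.arange(n), comp_labels]       # loss against complementary label
    return np.mean(total - (num_classes - 1) * comp)
```

As a sanity check, a model that confidently predicts the complementary class (the one class it is told is wrong) should incur a higher estimated risk than one that confidently predicts any other class.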

Related benchmarks

| Task | Dataset | Accuracy | Rank |
|------|---------|----------|------|
| Image Classification | MNIST (test) | 88.1 | 882 |
| Image Classification | Fashion MNIST (test) | 81.73 | 568 |
| Image Classification | CIFAR-10 | 36.8 | 471 |
| Image Classification | MNIST | 88.1 | 395 |
| Image Classification | SVHN (test) | 17.56 | 362 |
| Classification | CIFAR10 (test) | 36.8 | 266 |
| Image Classification | Fashion MNIST | 78.7 | 225 |
| Time-series Classification | PENDIGITS (test) | 15.01 | 36 |
| Classification | LETTER (test) | 5.12 | 33 |
| Image Classification | EMNIST Balanced (test) | 4.25 | 26 |

Showing 10 of 18 rows.
