
Learning from Complementary Labels

About

Collecting labeled data is costly and thus a critical bottleneck in real-world classification tasks. To mitigate this problem, we propose a novel setting, namely learning from complementary labels for multi-class classification. A complementary label specifies a class that a pattern does not belong to. Collecting complementary labels would be less laborious than collecting ordinary labels, since users do not have to carefully choose the correct class from a long list of candidate classes. However, complementary labels are less informative than ordinary labels and thus a suitable approach is needed to better learn from them. In this paper, we show that an unbiased estimator to the classification risk can be obtained only from complementarily labeled data, if a loss function satisfies a particular symmetric condition. We derive estimation error bounds for the proposed method and prove that the optimal parametric convergence rate is achieved. We further show that learning from complementary labels can be easily combined with learning from ordinary labels (i.e., ordinary supervised learning), providing a highly practical implementation of the proposed method. Finally, we experimentally demonstrate the usefulness of the proposed methods.
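To see intuitively why complementary labels still carry class information, suppose (as an illustrative assumption, matching the uniform setting commonly used in this line of work) that a complementary label is drawn uniformly from the K−1 incorrect classes. Then the complementary-label distribution satisfies p̄(c) = (1 − p(c)) / (K − 1), so the ordinary class distribution can be recovered as p(c) = 1 − (K − 1)·p̄(c). The sketch below verifies this identity empirically with NumPy; the class prior `p` and sample size are arbitrary choices for the demonstration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5
p = np.array([0.4, 0.25, 0.15, 0.12, 0.08])  # true class prior p(y), chosen arbitrarily
n = 200_000

# Sample ordinary labels, then draw each complementary label
# uniformly from the K-1 classes other than the true one.
y = rng.choice(K, size=n, p=p)
shift = rng.integers(1, K, size=n)   # offsets 1..K-1
ybar = (y + shift) % K               # uniform over {0..K-1} \ {y}

# Empirical complementary-label distribution: p̄(c) ≈ (1 - p(c)) / (K - 1)
p_bar = np.bincount(ybar, minlength=K) / n

# Invert the relation to recover the ordinary class distribution.
p_recovered = 1 - (K - 1) * p_bar
print(np.round(p_recovered, 3))  # close to p, up to sampling noise
```

This inversion only recovers marginal class probabilities; the paper's contribution goes further, constructing an unbiased estimator of the full classification risk from complementarily labeled data when the loss satisfies the stated symmetric condition.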

Takashi Ishida, Gang Niu, Weihua Hu, Masashi Sugiyama • 2017

Related benchmarks

Task                         Dataset                  Accuracy   Rank
Image Classification         MNIST (test)             80.2       882
Image Classification         Fashion MNIST (test)     77.34      568
Image Classification         CIFAR-10                 33.4       471
Image Classification         MNIST                    80.2       395
Image Classification         SVHN (test)              20.74      362
Classification               CIFAR10 (test)           33.4       266
Image Classification         Fashion MNIST            75.7       225
Time-series classification   PENDIGITS (test)         62.98      36
Classification               LETTER (test)            9.17       33
Image Classification         EMNIST Balanced (test)   14.28      26
Showing 10 of 14 rows
