
Learning to Learn from Noisy Labeled Data

About

Despite the success of deep neural networks (DNNs) in image classification tasks, human-level performance relies on massive training data with high-quality manual annotations, which are expensive and time-consuming to collect. Many inexpensive data sources exist on the web, but they tend to contain inaccurate labels. Training on noisily labeled datasets degrades performance because DNNs can easily overfit to the label noise. To overcome this problem, we propose a noise-tolerant training algorithm in which a meta-learning update is performed before the conventional gradient update. The proposed meta-learning method simulates actual training by generating synthetic noisy labels, and trains the model such that after one gradient update using each set of synthetic noisy labels, the model does not overfit to that specific noise. We conduct extensive experiments on the noisy CIFAR-10 dataset and the Clothing1M dataset. The results demonstrate the advantageous performance of the proposed method compared to several state-of-the-art baselines.
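The meta-learning update described above can be sketched in a simplified form: generate several synthetic noisy-label sets, take a one-step gradient update on each, and then adjust the weights so the updated model's predictions stay consistent with the pre-update predictions. The sketch below uses a linear softmax classifier, a first-order approximation of the meta-gradient, and a plain consistency objective; the function names (`meta_noise_update`, `ce_grad`) and all hyperparameters are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def ce_grad(W, X, Y):
    """Gradient of the cross-entropy loss for a linear softmax classifier."""
    P = softmax(X @ W)
    return X.T @ (P - Y) / len(X)

def meta_noise_update(W, X, Y, n_sets=3, flip_frac=0.2,
                      inner_lr=0.1, meta_lr=0.01):
    """One meta-learning step (illustrative sketch):
    for each of n_sets synthetic noisy-label sets, take a one-step
    gradient update, then nudge W so that the updated model's
    predictions remain consistent with the pre-update predictions."""
    P_ref = softmax(X @ W)            # predictions before any noisy update
    meta_grad = np.zeros_like(W)
    n, k = Y.shape
    for _ in range(n_sets):
        # synthesize label noise: randomly flip a fraction of labels
        Yn = Y.copy()
        flip = rng.random(n) < flip_frac
        Yn[flip] = np.eye(k)[rng.integers(k, size=flip.sum())]
        # one-step gradient update on the synthetic noisy labels
        W1 = W - inner_lr * ce_grad(W, X, Yn)
        P1 = softmax(X @ W1)
        # first-order meta-gradient of the consistency loss,
        # treating dW1/dW ~ I (FOMAML-style approximation)
        meta_grad += X.T @ (P1 - P_ref) / n
    return W - meta_lr * meta_grad / n_sets
```

In the full algorithm this meta step would precede each conventional gradient update on the (real) noisy training labels; here it is shown in isolation on a single batch.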

Junnan Li, Yongkang Wong, Qi Zhao, Mohan Kankanhalli · 2018

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | CIFAR-100 (test) | Accuracy | 68.5 | 3518 |
| Image Classification | CIFAR-10 (test) | Accuracy | 96 | 3381 |
| Image Classification | CIFAR-10 (test) | Accuracy | 93.52 | 906 |
| Image Classification | Clothing1M (test) | Accuracy | 73.5 | 546 |
| Image Classification | ImageNet (val) | Top-1 Accuracy | 60.2 | 354 |
| Image Classification | Food-101 (test) | Top-1 Accuracy | 82.5 | 89 |
| Image Classification | CIFAR-100 Symmetric Noise (test) | Accuracy | 68.5 | 76 |
| Image Classification | CIFAR-100 (test) | Accuracy (Symmetric 20%) | 68.5 | 72 |
| Image Classification | CIFAR-10 synthetic noise (test) | Accuracy | 92.9 | 69 |
| Image Classification | CIFAR-10 Symmetric Noise (test) | -- | -- | 64 |

Showing 10 of 49 rows
