
Probabilistic End-to-end Noise Correction for Learning with Noisy Labels

About

Deep learning has achieved excellent performance in various computer vision tasks, but requires large numbers of training examples with clean labels. It is easy to collect a dataset with noisy labels, but such noise causes networks to overfit severely and accuracy to drop dramatically. To address this problem, we propose an end-to-end framework called PENCIL, which can update both network parameters and label estimations as label distributions. PENCIL is independent of the backbone network structure and does not need an auxiliary clean dataset or prior information about the noise, so it is more general and robust than existing methods and easy to apply. PENCIL outperforms previous state-of-the-art methods by large margins on both synthetic and real-world datasets with different noise types and noise rates. Experiments show that PENCIL is robust on clean datasets, too.
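The core idea of the abstract, maintaining a learnable label distribution per training example and updating it by gradient descent alongside the network weights, can be illustrated with a toy sketch. This is not the authors' implementation: the linear "network", the loss weighting `alpha`, the learning rates, and the synthetic data are all illustrative assumptions; only the joint update of weights and label logits follows the paper's idea.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
n, d, c = 120, 5, 3
X = rng.normal(size=(n, d))
noisy_labels = rng.integers(0, c, size=n)   # stand-in for a noisily labelled set
onehot = np.eye(c)[noisy_labels]

W = np.zeros((d, c))                        # "network" parameters (a linear model here)
label_logits = 10.0 * onehot                # label distributions, initialised from noisy labels
lr_w, lr_y, alpha = 0.5, 2.0, 0.1          # illustrative hyper-parameters
eps = 1e-12

def total_loss():
    pred = softmax(X @ W)
    yd = softmax(label_logits)
    lc = -(yd * np.log(pred + eps)).sum(axis=1).mean()    # fit predictions to label distributions
    lo = -(onehot * np.log(yd + eps)).sum(axis=1).mean()  # stay compatible with the noisy labels
    return lc + alpha * lo

first = total_loss()
for _ in range(300):
    pred = softmax(X @ W)
    yd = softmax(label_logits)
    # gradient w.r.t. W of the cross-entropy term (standard softmax backprop)
    W -= lr_w * (X.T @ (pred - yd)) / n
    # gradient w.r.t. the label logits: classification term + compatibility term,
    # so the label estimations themselves are corrected end-to-end
    g = -np.log(pred + eps)
    grad_lc = yd * (g - (yd * g).sum(axis=1, keepdims=True)) / n
    grad_lo = alpha * (yd - onehot) / n
    label_logits -= lr_y * (grad_lc + grad_lo)
last = total_loss()
```

After training, `softmax(label_logits)` gives the corrected label distributions; examples whose noisy label disagrees with what the model can fit drift away from their one-hot initialisation.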

Kun Yi, Jianxin Wu • 2019

Related benchmarks

Task                 | Dataset                          | Metric                  | Result | Rank
---------------------|----------------------------------|-------------------------|--------|-----
Image Classification | CIFAR-100 (test)                 | Accuracy                | 69.4   | 3518
Image Classification | CIFAR-10 (test)                  | Accuracy                | 92.4   | 3381
Image Classification | CIFAR-10 (test)                  | Accuracy                | 93.04  | 906
Image Classification | CIFAR-100                        | --                      | --     | 622
Image Classification | Clothing1M (test)                | Accuracy                | 73.5   | 546
Image Classification | CIFAR-10                         | Accuracy                | 92.4   | 471
Image Classification | ImageNet (val)                   | Top-1 Accuracy          | 60.8   | 354
Image Classification | Food-101 (test)                  | Top-1 Accuracy          | 83.1   | 89
Image Classification | CIFAR-100 Symmetric Noise (test) | Accuracy                | 69.4   | 76
Image Classification | CIFAR-100 (test)                 | Accuracy (Symmetric 20%)| 69.4   | 72

Showing 10 of 48 rows
