
Symmetric Cross Entropy for Robust Learning with Noisy Labels

About

Training accurate deep neural networks (DNNs) in the presence of noisy labels is an important and challenging task. Though a number of approaches have been proposed for learning with noisy labels, many open issues remain. In this paper, we show that DNN learning with Cross Entropy (CE) exhibits overfitting to noisy labels on some classes ("easy" classes), but more surprisingly, it also suffers from significant under-learning on some other classes ("hard" classes). Intuitively, CE requires an extra term to facilitate learning of hard classes, and more importantly, this term should be noise tolerant, so as to avoid overfitting to noisy labels. Inspired by the symmetric KL-divergence, we propose the approach of Symmetric cross entropy Learning (SL), boosting CE symmetrically with a noise-robust counterpart, Reverse Cross Entropy (RCE). Our proposed SL approach simultaneously addresses both the under-learning and overfitting problems of CE in the presence of noisy labels. We provide a theoretical analysis of SL and also empirically show, on a range of benchmark and real-world datasets, that SL outperforms state-of-the-art methods. We also show that SL can be easily incorporated into existing methods in order to further enhance their performance.
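The SL objective described in the abstract combines standard CE with RCE, where the log of a zero target probability is truncated to a constant. The following is a minimal NumPy sketch of that combination, not the authors' reference implementation; the weighting parameters `alpha` and `beta` and the truncation value `A` are illustrative defaults, and in practice these are tuned per dataset.

```python
import numpy as np

def symmetric_cross_entropy(logits, labels, num_classes,
                            alpha=0.1, beta=1.0, A=-4.0):
    """Sketch of the SL loss: alpha * CE + beta * RCE (illustrative defaults)."""
    # Softmax predictions p(k|x), computed stably.
    z = logits - logits.max(axis=1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    # One-hot label distribution q(k|x).
    q = np.eye(num_classes)[labels]
    # CE = -sum_k q(k|x) log p(k|x); clip p to avoid log(0).
    ce = -(q * np.log(np.clip(p, 1e-7, 1.0))).sum(axis=1)
    # RCE = -sum_k p(k|x) log q(k|x); log q is 0 for the labeled class
    # and log(0) is truncated to the constant A for the other classes.
    log_q = np.where(q > 0, 0.0, A)
    rce = -(p * log_q).sum(axis=1)
    return (alpha * ce + beta * rce).mean()
```

A confident, correct prediction drives both terms toward zero, while the RCE term stays bounded for wrong labels, which is the source of its noise tolerance.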

Yisen Wang, Xingjun Ma, Zaiyi Chen, Yuan Luo, Jinfeng Yi, James Bailey · 2019

Related benchmarks

Task                  Dataset           Result               Rank
Image Classification  CIFAR-100 (test)  Accuracy 72.44       3518
Image Classification  CIFAR-10 (test)   Accuracy 89.2        3381
Image Classification  ImageNet (val)    Top-1 Acc 63.28      1206
Image Classification  CIFAR-10 (test)   Accuracy 91.3        906
Node Classification   Cora              Accuracy 76.1        885
Image Classification  MNIST (test)      Accuracy 99.24       882
Node Classification   Citeseer          Accuracy 62.5        804
Node Classification   Pubmed            Accuracy 72.1        742
Node Classification   Cora (test)       Mean Accuracy 71.15  687
Image Classification  CIFAR-100         --                   622

Showing 10 of 137 rows
