
Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels

About

Deep learning with noisy labels is practically challenging, because the capacity of deep models is so high that they will eventually memorize the noisy labels during training. Nonetheless, recent studies on the memorization effects of deep neural networks show that they first memorize training data with clean labels and only later data with noisy labels. Motivated by this, we propose a new deep learning paradigm called Co-teaching for combating noisy labels. We train two deep neural networks simultaneously and let them teach each other on every mini-batch: first, each network feeds forward all the data and selects samples that likely have clean labels; second, the two networks communicate to each other which samples in the mini-batch should be used for training; finally, each network back-propagates on the samples selected by its peer network and updates itself. Empirical results on noisy versions of MNIST, CIFAR-10 and CIFAR-100 demonstrate that Co-teaching trains substantially more robust deep models than state-of-the-art methods.

Bo Han, Quanming Yao, Xingrui Yu, Gang Niu, Miao Xu, Weihua Hu, Ivor Tsang, Masashi Sugiyama • 2018
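The mini-batch exchange described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes per-sample losses for each network have already been computed, and the function names and the fixed "remember rate" (the fraction of small-loss samples treated as likely clean) are ours.

```python
import numpy as np

def small_loss_selection(losses, remember_rate):
    """Return indices of the `remember_rate` fraction of samples in the
    mini-batch with the smallest loss (treated as likely clean)."""
    num_keep = int(remember_rate * len(losses))
    return np.argsort(losses)[:num_keep]

def co_teaching_step(losses_net1, losses_net2, remember_rate):
    """One Co-teaching exchange on a mini-batch: each network selects its
    small-loss samples, and those indices are handed to the peer network,
    which would then back-propagate only on them."""
    idx_for_net2 = small_loss_selection(losses_net1, remember_rate)  # net1 teaches net2
    idx_for_net1 = small_loss_selection(losses_net2, remember_rate)  # net2 teaches net1
    return idx_for_net1, idx_for_net2

# Toy usage: four samples, keep the lowest-loss half.
losses1 = np.array([0.1, 2.0, 0.2, 3.0])
losses2 = np.array([1.5, 0.05, 0.3, 2.5])
idx1, idx2 = co_teaching_step(losses1, losses2, remember_rate=0.5)
```

In a full training loop the remember rate would start near 1 and decay over epochs, so that more samples are discarded once the networks have fit the clean portion of the data.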

Related benchmarks

Task                 | Dataset              | Metric        | Result | Rank
---------------------+----------------------+---------------+--------+-----
Image Classification | CIFAR-100 (test)     | Accuracy      | 73.39  | 3518
Image Classification | CIFAR-10 (test)      | Accuracy      | 91.22  | 3381
Image Classification | CIFAR-10 (test)      | Accuracy      | 94.64  |  906
Node Classification  | Cora                 | Accuracy      | 66.7   |  885
Image Classification | MNIST (test)         | Accuracy      | 97.25  |  882
Node Classification  | Citeseer             | Accuracy      | 50.9   |  804
Node Classification  | Pubmed               | Accuracy      | 68.9   |  742
Node Classification  | Cora (test)          | Mean Accuracy | 62.09  |  687
Image Classification | Fashion MNIST (test) | Accuracy      | 91.48  |  568
Image Classification | Clothing1M (test)    | Accuracy      | 71.68  |  546

Showing 10 of 273 rows.
