
Joint Optimization Framework for Learning with Noisy Labels

About

Deep neural networks (DNNs) trained on large-scale datasets have achieved impressive performance in image classification. However, many large-scale datasets are collected from websites and tend to contain inaccurately annotated labels, termed noisy labels. Training on such noisy-labeled datasets causes performance degradation because DNNs easily overfit to noisy labels. To overcome this problem, we propose a joint optimization framework for learning DNN parameters and estimating true labels. Our framework corrects labels during training by alternately updating the network parameters and the labels. We conduct experiments on noisy CIFAR-10 datasets and the Clothing1M dataset, and the results show that our approach significantly outperforms other state-of-the-art methods.
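The alternating update described above can be sketched in a few lines. The toy below is an illustrative assumption, not the paper's implementation: it uses a linear softmax classifier in place of a DNN, a plain cross-entropy gradient step for the parameter update, and a running-average soft relabeling for the label update; the function name `joint_optimize` and the mixing rate `alpha` are invented for this sketch.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def joint_optimize(X, y_noisy, n_classes, epochs=50, lr=0.5, alpha=0.8):
    """Alternate between (1) a gradient step on the classifier weights
    against the current label estimates and (2) nudging the soft label
    estimates toward the classifier's current predictions."""
    n, d = X.shape
    W = np.zeros((d, n_classes))
    # initialize the label estimates from the one-hot noisy labels
    y = np.eye(n_classes)[y_noisy].astype(float)
    for _ in range(epochs):
        # step 1: update parameters on the current label estimates
        p = softmax(X @ W)
        W -= lr * X.T @ (p - y) / n      # cross-entropy gradient
        # step 2: update label estimates toward current predictions
        p = softmax(X @ W)
        y = alpha * y + (1 - alpha) * p  # soft relabeling
    return W, y
```

Because each label row starts one-hot and is replaced by a convex combination with a softmax output, the rows of `y` remain valid probability distributions throughout training.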

Daiki Tanaka, Daiki Ikami, Toshihiko Yamasaki, Kiyoharu Aizawa • 2018

Related benchmarks

Task                             | Dataset                         | Metric                    | Result | Rank
Image Classification             | CIFAR-100 (test)                | Accuracy                  | 72.94  | 3518
Image Classification             | CIFAR-10 (test)                 | Accuracy                  | 88.9   | 3381
Image Classification             | CIFAR-10 (test)                 | Accuracy                  | 93.5   | 906
Image Classification             | Clothing1M (test)               | Accuracy                  | 72.23  | 574
Image Classification             | ImageNet (val)                  | Top-1 Accuracy            | 59.5   | 354
Whole Slide Image Classification | CAMELYON16 (test)               | AUC                       | 0.9894 | 163
Image Classification             | CIFAR-100 non-IID (test)        | Test Accuracy (Avg Best)  | 59.84  | 113
Image Classification             | Food-101 (test)                 | Top-1 Accuracy            | 81.5   | 89
Image Classification             | CIFAR-10 standard (test)        | Accuracy                  | 88.37  | 81
Image Classification             | CIFAR-10 Symmetric Noise (test) | Test Accuracy (Overall)   | 93.6   | 64

(Showing 10 of 33 rows.)
