
Faster Meta Update Strategy for Noise-Robust Deep Learning

About

It has been shown that deep neural networks are prone to overfitting on biased training data. To address this issue, meta-learning employs a meta model to correct the training bias. Despite promising performance, extremely slow training is currently the bottleneck of meta-learning approaches. In this paper, we introduce a novel Faster Meta Update Strategy (FaMUS) that replaces the most expensive step in the meta-gradient computation with a faster layer-wise approximation. We empirically find that FaMUS yields not only a reasonably accurate but also a low-variance approximation of the meta gradient. We conduct extensive experiments to verify the proposed method on two tasks. We show that our method saves two-thirds of the training time while maintaining comparable, or achieving even better, generalization performance. In particular, our method achieves state-of-the-art performance on both synthetic and realistic noisy labels, and obtains promising performance on long-tailed recognition on standard benchmarks.
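The core idea above can be illustrated with a toy sketch. The full meta gradient requires a second-order backward pass through every layer of the model; a layer-wise approximation instead aggregates cheap per-layer similarities between training and validation gradients. The function and coefficient names below are hypothetical illustrations, not the authors' implementation:

```python
# Hypothetical sketch of a layer-wise meta-gradient approximation.
# Instead of backpropagating the meta loss through all parameters
# (the expensive second-order step), we form a cheap proxy signal:
# a weighted sum of per-layer inner products between the training
# gradient and the validation (meta) gradient.

def dot(u, v):
    """Inner product of two flat gradient vectors."""
    return sum(a * b for a, b in zip(u, v))

def layerwise_meta_signal(train_grads, val_grads, layer_coeffs):
    """Aggregate per-layer gradient agreement into one scalar.

    train_grads, val_grads: lists of per-layer gradient vectors.
    layer_coeffs: per-layer coefficients (learnable gates in the
    paper's setting; fixed here for illustration).
    """
    return sum(
        c * dot(g_tr, g_val)
        for c, g_tr, g_val in zip(layer_coeffs, train_grads, val_grads)
    )

# Toy example: a two-layer model with flattened gradients.
train_grads = [[0.5, -0.2], [0.1, 0.3]]
val_grads = [[0.4, -0.1], [0.2, 0.2]]
coeffs = [1.0, 1.0]
signal = layerwise_meta_signal(train_grads, val_grads, coeffs)
```

A positive signal indicates the training gradient agrees with the validation gradient, which a meta model can use to up-weight a sample; the cost is a handful of inner products rather than a full second-order backward pass.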

Youjiang Xu, Linchao Zhu, Lu Jiang, Yi Yang • 2021

Related benchmarks

Task                  Dataset                      Metric                Result   Rank
Image Classification  Clothing1M (test)            Accuracy              74.43    546
Image Classification  ImageNet ILSVRC-2012 (val)   Top-1 Accuracy        77       405
Image Classification  ImageNet (val)               Top-1 Accuracy        77       354
Image Classification  CIFAR-10 long-tailed (test)  --                    --       201
Image Classification  ILSVRC 2012 (test)           Top-1 Acc             77       117
Image Classification  CIFAR-100 (test)             --                    --       72
Image Classification  WebVision (test)             Acc                   79.4     57
Image Classification  Red Mini-ImageNet (test)     Accuracy              51.42    51
Image Classification  CIFAR100-LT (test)           Top-1 Acc (IR=100)    46.03    45
Image Classification  WebVision (val)              Top-1 Acc             79.4     40
Showing 10 of 31 rows

Other info

Code
