
Asymmetric Loss Functions for Learning with Noisy Labels

About

Robust loss functions are essential for training deep neural networks with good generalization in the presence of noisy labels. Symmetric loss functions are known to be robust to label noise, but the symmetric condition is overly restrictive. In this work, we propose a new class of loss functions, namely asymmetric loss functions, which are robust to learning with noisy labels for various types of noise. We investigate general theoretical properties of asymmetric loss functions, including classification calibration, excess risk bounds, and noise tolerance. We also introduce the asymmetry ratio to measure the asymmetry of a loss function; empirical results show that a higher ratio yields better noise tolerance. Moreover, we modify several commonly used loss functions and establish the necessary and sufficient conditions for them to be asymmetric. Experimental results on benchmark datasets demonstrate that asymmetric loss functions can outperform state-of-the-art methods. The code is available at https://github.com/hitcszx/ALFs
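To make the symmetry condition mentioned above concrete: a loss L is symmetric when the sum of L(f(x), k) over all K classes is a constant, independent of the prediction — MAE satisfies this, cross-entropy does not. The sketch below (illustrative only; the paper's asymmetric condition and its proposed losses are not reproduced here) verifies this numerically for a random softmax output:

```python
import numpy as np

def mae_loss(p, y):
    # MAE between softmax output p and the one-hot label for class y,
    # which simplifies to 2 * (1 - p[y]).
    e = np.zeros_like(p)
    e[y] = 1.0
    return np.abs(p - e).sum()

def ce_loss(p, y):
    # Standard cross-entropy for class y.
    return -np.log(p[y])

rng = np.random.default_rng(0)
K = 4
p = rng.dirichlet(np.ones(K))  # a random point on the probability simplex

# Symmetric condition: sum over all labels is constant for MAE...
mae_sum = sum(mae_loss(p, y) for y in range(K))  # = 2 * (K - 1) for any p
# ...but prediction-dependent for cross-entropy.
ce_sum = sum(ce_loss(p, y) for y in range(K))
```

Because the MAE sum equals 2(K−1) regardless of p, relabeling noise shifts every prediction's total loss equally, which is the intuition behind its noise robustness; the paper's contribution is to relax this constant-sum requirement into a weaker asymmetric condition.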

Xiong Zhou, Xianming Liu, Junjun Jiang, Xin Gao, Xiangyang Ji • 2021

Related benchmarks

Task                  Dataset                                          Metric            Result  Rank
Image Classification  ImageNet (val)                                   Top-1 Acc         62.68   1206
Image Classification  Clothing1M (test)                                Accuracy          67.52   546
Image Classification  WebVision 1.0 (val)                              Top-1 Acc         66      59
Retrieval             MS Marco                                         Recall            50.6    20
Image Classification  CIFAR-100 instance-dependent noise (IDN) (test)  Acc (η=0.2)       65.33   18
Image Classification  CIFAR-10 instance-dependent noise (IDN) (test)   Accuracy (η=0.2)  88.9    18
Retrieval             LCQuAD                                           Recall            89.6    8
