
Does label smoothing mitigate label noise?

About

Label smoothing is commonly used in training deep learning models, wherein one-hot training labels are mixed with uniform label vectors. Empirically, smoothing has been shown to improve both predictive performance and model calibration. In this paper, we study whether label smoothing is also effective as a means of coping with label noise. While label smoothing apparently amplifies this problem, being equivalent to injecting symmetric noise into the labels, we show how it relates to a general family of loss-correction techniques from the label noise literature. Building on this connection, we show that label smoothing is competitive with loss correction under label noise. Further, we show that when distilling models from noisy data, label smoothing of the teacher is beneficial; this is in contrast to recent findings for noise-free problems, and sheds further light on settings where label smoothing is beneficial.
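The mixing operation the abstract describes can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's code; the smoothing weight `alpha` and the function names are assumptions for the example.

```python
import numpy as np

def smooth_labels(one_hot, alpha):
    """Label smoothing: mix one-hot labels with the uniform distribution.

    With K classes, the true class gets weight (1 - alpha) + alpha/K
    and every other class gets alpha/K, so each row still sums to 1.
    """
    k = one_hot.shape[-1]
    return (1.0 - alpha) * one_hot + alpha / k

def cross_entropy(probs, targets):
    """Mean cross-entropy between predicted probabilities and (soft) targets."""
    return -np.mean(np.sum(targets * np.log(probs + 1e-12), axis=-1))

# Toy example with 3 classes: labels 0 and 2, smoothing weight 0.1.
one_hot = np.eye(3)[[0, 2]]
smoothed = smooth_labels(one_hot, alpha=0.1)
# True-class weight per row: 0.9 + 0.1/3; off-class weight: 0.1/3.
```

Training against `smoothed` instead of `one_hot` in `cross_entropy` is exactly the setting the paper analyzes: the soft targets look like labels corrupted by symmetric noise, which is what connects smoothing to loss-correction techniques.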

Michal Lukasik, Srinadh Bhojanapalli, Aditya Krishna Menon, Sanjiv Kumar• 2020

Related benchmarks

Task                              | Dataset                                                                                | Metric                  | Result | Rank
Image Classification              | Clothing1M (test)                                                                      | Accuracy                | 73.44  | 574
Fine-grained Image Classification | CUB200 2011 (test)                                                                     | Accuracy                | 68.78  | 543
Fine-grained Image Classification | Stanford Cars (test)                                                                   | Accuracy                | 74.28  | 348
Fine-grained Image Classification | Stanford Dogs (test)                                                                   | Accuracy                | 74.7   | 124
Mathematical Reasoning            | In-Distribution Reasoning Performance Suite (AIME, AMC, MATH-500, Minerva, Olympiad)   | AIME 2024 Score         | 14.6   | 97
Image Classification              | CIFAR-10N (Worst)                                                                      | Accuracy                | 82.76  | 83
Image Classification              | CIFAR-10N (Aggregate)                                                                  | Accuracy                | 91.57  | 78
Image Classification              | CIFAR-100 Symmetric Noise (test)                                                       | Accuracy                | 55.17  | 76
Image Classification              | CIFAR-10 Symmetric Noise (test)                                                        | Test Accuracy (Overall) | 90.24  | 64
General Reasoning                 | Out-of-Distribution Performance Suite (ARC-c, GPQA*, MMLU-Pro) (test)                  | ARC-c Score             | 86.5   | 51
Showing 10 of 23 rows
