
Smooth Neighbors on Teacher Graphs for Semi-supervised Learning

About

Recently proposed self-ensembling methods have achieved promising results in deep semi-supervised learning by penalizing inconsistent predictions on unlabeled data under different perturbations. However, these methods only consider perturbations of each single data point, ignoring the connections between data samples. In this paper, we propose a novel method called Smooth Neighbors on Teacher Graphs (SNTG). In SNTG, a graph is constructed from the predictions of the teacher model, i.e., the implicit self-ensemble of models. The graph then serves as a similarity measure with respect to which the representations of "similar" neighboring points are learned to be smooth on the low-dimensional manifold. We achieve state-of-the-art results on semi-supervised learning benchmarks: error rates of 9.89% on CIFAR-10 with 4000 labels and 3.99% on SVHN with 500 labels. The improvements are especially significant when labels are scarce: for non-augmented MNIST with only 20 labels, the error rate is reduced from the previous 4.81% to 1.36%. Our method is also robust to noisy labels.
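To make the idea concrete, below is a minimal NumPy sketch of a neighbor-smoothing loss in the spirit of SNTG: edges are assumed between points whose teacher predictions share the same argmax class, connected points are pulled together in feature space, and disconnected points are pushed apart up to a margin. The function name, the margin value, and the simple pairwise loop are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def sntg_loss(features, teacher_logits, margin=1.0):
    """Sketch of a graph-based neighbor-smoothing (contrastive) loss.

    features:        (n, d) array of low-dimensional representations.
    teacher_logits:  (n, c) array of teacher-model predictions.
    An edge W_ij = 1 is assumed when the teacher's argmax classes of
    i and j agree, else W_ij = 0 (illustrative graph construction).
    """
    pseudo = teacher_logits.argmax(axis=1)  # teacher's hard pseudo-labels
    n = features.shape[0]
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(features[i] - features[j])
            if pseudo[i] == pseudo[j]:
                total += d ** 2                      # neighbors: smooth
            else:
                total += max(0.0, margin - d) ** 2   # non-neighbors: separate
            pairs += 1
    return total / pairs
```

In practice this term would be computed on mini-batch pairs and added, with a weight, to the supervised cross-entropy and consistency losses; the O(n²) loop here is only for clarity.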

Yucen Luo, Jun Zhu, Mengxi Li, Yong Ren, Bo Zhang • 2017

Related benchmarks

Task                 | Dataset                        | Result                    | Rank
Image Classification | CIFAR-10 (test)                | -                         | 3381
Image Classification | CIFAR-10                       | -                         | 507
Image Classification | CIFAR10                        | -                         | 70
Image Classification | SVHN 1000 labels (test)        | Error Rate: 3.86          | 69
Image Classification | SVHN 250 labels                | Test Error Rate: 4.29     | 61
Image Classification | CIFAR-10 4,000 labels (test)   | Test Error Rate: 10.93    | 57
Classification       | SVHN                           | Error Rate: 3.83          | 21
Image Classification | CIFAR-100                      | Top-1 Error Rate: 37.97   | 18
Image Classification | SVHN 1,000 labels (train)      | Error Rate (%): 4.02      | 15
Image Classification | CIFAR10 4,000 labels (train)   | Error Rate: 12.49         | 15

Showing 10 of 13 rows
