Smooth Neighbors on Teacher Graphs for Semi-supervised Learning
About
Recently proposed self-ensembling methods have achieved promising results in deep semi-supervised learning by penalizing inconsistent predictions of unlabeled data under different perturbations. However, they only add perturbations to individual data points, ignoring the connections between data samples. In this paper, we propose a novel method called Smooth Neighbors on Teacher Graphs (SNTG). In SNTG, a graph is constructed based on the predictions of the teacher model, i.e., the implicit self-ensemble of models. The graph then serves as a similarity measure under which the representations of "similar" neighboring points are encouraged to be smooth on the low-dimensional manifold. We achieve state-of-the-art results on semi-supervised learning benchmarks: the error rates are 9.89% on CIFAR-10 with 4,000 labels and 3.99% on SVHN with 500 labels. The improvements are especially significant when fewer labels are available. On non-augmented MNIST with only 20 labels, the error rate is reduced from the previous 4.81% to 1.36%. Our method is also robust to noisy labels.
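To make the idea concrete, here is a minimal sketch of the SNTG penalty for one mini-batch, assuming a contrastive-style formulation: the teacher's hard predictions define a batch graph (two points are neighbors when the teacher assigns them the same class), neighboring features are pulled together, and non-neighbors are pushed apart up to a margin. The function name, margin value, and NumPy formulation are illustrative, not the authors' reference implementation.

```python
import numpy as np

def sntg_loss(features, teacher_logits, margin=1.0):
    """Sketch of an SNTG-style smoothness penalty for one mini-batch.

    features:       (n, d) low-dimensional representations of the batch.
    teacher_logits: (n, c) teacher-model predictions for the same batch.
    """
    # Build the teacher graph: W[i, j] = 1 iff the teacher assigns
    # points i and j the same pseudo-label.
    labels = teacher_logits.argmax(axis=1)
    W = (labels[:, None] == labels[None, :]).astype(float)

    # Pairwise Euclidean distances between feature vectors.
    diff = features[:, None, :] - features[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))

    # Neighbors: penalize squared distance (pull together).
    # Non-neighbors: hinge on the margin (push apart).
    loss_mat = W * dist ** 2 + (1 - W) * np.maximum(0.0, margin - dist) ** 2

    # Average over distinct pairs, excluding self-pairs on the diagonal.
    n = len(features)
    np.fill_diagonal(loss_mat, 0.0)
    return loss_mat.sum() / (n * (n - 1))
```

In training, this term would be added to the usual supervised loss and the consistency loss of the underlying self-ensembling method, weighted by a coefficient.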
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | CIFAR-10 (test) | -- | 3381 |
| Image Classification | CIFAR-10 | -- | 507 |
| Image Classification | CIFAR10 | -- | 70 |
| Image Classification | SVHN 1000 labels (test) | Error Rate: 3.86% | 69 |
| Image Classification | SVHN 250 labels | Test Error Rate: 4.29% | 61 |
| Image Classification | CIFAR-10 4,000 labels (test) | Test Error Rate: 10.93% | 57 |
| Classification | SVHN | Error Rate: 3.83% | 21 |
| Image Classification | CIFAR-100 | Top-1 Error Rate: 37.97% | 18 |
| Image Classification | SVHN 1,000 labels (train) | Error Rate: 4.02% | 15 |
| Image Classification | CIFAR10 4,000 labels (train) | Error Rate: 12.49% | 15 |