
Building One-Shot Semi-supervised (BOSS) Learning up to Fully Supervised Performance

About

Reaching the performance of fully supervised learning with unlabeled data and only one labeled sample per class would be ideal for deep learning applications. We demonstrate for the first time the potential for building one-shot semi-supervised (BOSS) learning on CIFAR-10 and SVHN up to test accuracies that are comparable to fully supervised learning. Our method combines class prototype refining, class balancing, and self-training. A good prototype choice is essential, and we propose a technique for obtaining iconic examples. In addition, we demonstrate that class-balancing methods substantially improve accuracy in semi-supervised learning, to levels that allow self-training to reach fully supervised performance. Rigorous empirical evaluations provide evidence that labeling large datasets is not necessary for training deep neural networks. Our code is available at https://github.com/lnsmith54/BOSS to facilitate replication and use in future real-world applications.
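To make the idea concrete, here is a minimal, self-contained sketch of the self-training loop with prototype refinement on toy 2-D data. It is an illustration of the general technique only, not the authors' implementation (which trains deep networks on CIFAR-10/SVHN); the function names, the nearest-prototype classifier, and the synthetic clusters are all assumptions made for this example.

```python
import numpy as np

def refine_prototypes(x_labeled, y_labeled, x_unlabeled, num_classes, n_iters=5):
    """Toy prototype refinement via self-training: start from the few
    labeled examples (one per class in the one-shot setting), repeatedly
    pseudo-label the unlabeled pool with the nearest prototype, then
    recompute each prototype as the mean of its assigned points."""
    prototypes = np.stack([x_labeled[y_labeled == c].mean(axis=0)
                           for c in range(num_classes)])
    pseudo = np.zeros(len(x_unlabeled), dtype=int)
    for _ in range(n_iters):
        # Distance of every unlabeled point to every prototype.
        dists = np.linalg.norm(
            x_unlabeled[:, None, :] - prototypes[None, :, :], axis=-1)
        pseudo = dists.argmin(axis=1)  # pseudo-labels from nearest prototype
        for c in range(num_classes):
            members = x_unlabeled[pseudo == c]
            if len(members):
                prototypes[c] = members.mean(axis=0)
    return prototypes, pseudo

# Tiny synthetic demo: two well-separated clusters, one labeled point each.
rng = np.random.default_rng(0)
cluster0 = rng.normal(loc=0.0, scale=0.5, size=(50, 2))
cluster1 = rng.normal(loc=5.0, scale=0.5, size=(50, 2))
x_unlabeled = np.vstack([cluster0, cluster1])
x_labeled = np.array([[0.2, -0.1], [4.8, 5.1]])  # one "shot" per class
y_labeled = np.array([0, 1])

protos, pseudo = refine_prototypes(x_labeled, y_labeled, x_unlabeled, num_classes=2)
```

The same loop structure underlies deep self-training: the "prototype" step becomes retraining the network on confident pseudo-labels, and class balancing constrains the pseudo-label distribution so one class cannot absorb the whole unlabeled pool.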

Leslie N. Smith, Adam Conovaloff • 2020

Related benchmarks

Task                 | Dataset                                                  | Result         | Rank
Image Classification | CIFAR-10 (test)                                          | Accuracy 95.2  | 906
Image Classification | SVHN (test)                                              | Accuracy 98    | 362
Image Classification | CIFAR-10 long-tailed (test)                              | Top-1 Acc 70.3 | 201
Image Classification | STL10-LT (gamma_l = 10) (test)                           | Accuracy 76    | 42
Image Classification | CIFAR100 (Nl=50, Ml=400, γl=γu=10) long-tailed (test)    | Accuracy 50    | 16
Image Classification | CIFAR100 long-tailed (Nl=150, Ml=2000, γl=γu=10) (test)  | Accuracy 59.3  | 16
Image Classification | STL10 (Nl=150, M=100k, γl=10) long-tailed (test)         | Accuracy 66.4  | 16
Image Classification | CIFAR10-LT (Nl=1500, Ml=3000, γl=γu=100) (test)          | Accuracy 76.5  | 16
