# Milking CowMask for Semi-Supervised Image Classification

## About
Consistency regularization is a technique for semi-supervised learning that underlies a number of strong results for classification with little labeled data. It works by encouraging a learned model to be robust to perturbations of unlabeled data. Here, we present a novel mask-based augmentation method called CowMask. Using it to provide perturbations for semi-supervised consistency regularization, we achieve a state-of-the-art result on ImageNet with 10% labeled data: a top-5 error of 8.76% and a top-1 error of 26.06%, with a method that is much simpler than many alternatives. We further investigate the behavior of CowMask for semi-supervised learning by running many smaller-scale experiments on the SVHN, CIFAR-10, and CIFAR-100 datasets, where we achieve results competitive with the state of the art, indicating that CowMask is widely applicable. We open-source our code at https://github.com/google-research/google-research/tree/master/milking_cowmask
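The core idea behind CowMask, as described in the paper, is to generate binary masks with smooth, irregular cow-spot-like blobs by thresholding low-pass-filtered Gaussian noise. Below is a minimal sketch of that mask generation using NumPy and SciPy; the function name and parameters (`sigma` for the smoothing scale, `p` for the target masked fraction) are illustrative choices of ours, not the authors' API.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def cow_mask(shape, sigma=8.0, p=0.5, rng=None):
    """Sketch of CowMask generation: threshold smoothed Gaussian noise.

    shape: (H, W) of the mask.
    sigma: scale of the Gaussian smoothing filter; larger values give
           larger, smoother blobs.
    p:     approximate fraction of pixels that end up masked (set to 0).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Per-pixel Gaussian noise, smoothed to produce spatially coherent blobs.
    noise = rng.standard_normal(shape)
    smooth = gaussian_filter(noise, sigma)
    # Threshold at the p-quantile of the smoothed noise, so roughly a
    # fraction p of the pixels fall below it and are masked out.
    thresh = np.quantile(smooth, p)
    return (smooth > thresh).astype(np.float32)

# Example: a 32x32 mask with ~50% of pixels masked.
mask = cow_mask((32, 32), sigma=4.0, p=0.5)
```

For consistency regularization, such a mask can be used either to erase regions of an unlabeled image or to mix two images (mask-based mixing), and the model is trained to produce consistent predictions under these perturbations.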
## Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | CIFAR-100 (test) | -- | -- | 3518 |
| Image Classification | CIFAR-10 (test) | -- | -- | 906 |
| Image Classification | SVHN (test) | -- | -- | 362 |
| Image Classification | ImageNet (10% labels) | Top-1 Accuracy | 73.9% | 98 |
| Image Classification | ImageNet 1k (10% labels) | Top-1 Accuracy | 73.9% | 92 |
| Image Classification | ImageNet (10%) | Top-1 Accuracy | 73.9% | 32 |
| Image Classification | ImageNet 10% label fraction | Top-5 Accuracy | 91.2% | 23 |
| Image Classification | ImageNet 10% labels (test val) | Top-5 Error Rate | 8.76% | 10 |