SimpleShot: Revisiting Nearest-Neighbor Classification for Few-Shot Learning
About
Few-shot learners aim to recognize new object classes based on a small number of labeled training examples. To prevent overfitting, state-of-the-art few-shot learners use meta-learning on convolutional-network features and perform classification using a nearest-neighbor classifier. This paper studies the accuracy of nearest-neighbor baselines without meta-learning. Surprisingly, we find simple feature transformations suffice to obtain competitive few-shot learning accuracies. For example, we find that a nearest-neighbor classifier used in combination with mean-subtraction and L2-normalization outperforms prior results in three out of five settings on the miniImageNet dataset.
Yan Wang, Wei-Lun Chao, Kilian Q. Weinberger, Laurens van der Maaten• 2019
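The abstract's recipe (subtract a mean feature, L2-normalize, then classify queries by nearest class centroid) can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' released code; the feature vectors and the base-class mean are assumed to come from a pre-trained convolutional network, and the function name `simpleshot_predict` is our own.

```python
import numpy as np

def simpleshot_predict(support_feats, support_labels, query_feats, base_mean):
    """Nearest-centroid few-shot prediction with the CL2N transform
    (centering + L2-normalization) described in the abstract.

    support_feats: (N_support, D) features of the labeled few-shot examples
    support_labels: (N_support,) integer class labels
    query_feats: (N_query, D) features of the examples to classify
    base_mean: (D,) mean feature vector (assumed computed on base classes)
    """
    def cl2n(x):
        # Mean-subtraction followed by L2-normalization of each feature vector.
        x = x - base_mean
        return x / np.linalg.norm(x, axis=-1, keepdims=True)

    s, q = cl2n(support_feats), cl2n(query_feats)

    # For k-shot tasks, each class is represented by the mean of its
    # transformed support features (its centroid).
    classes = np.unique(support_labels)
    centroids = np.stack([s[support_labels == c].mean(axis=0) for c in classes])

    # Assign each query to the class with the nearest centroid (Euclidean).
    dists = np.linalg.norm(q[:, None, :] - centroids[None, :, :], axis=-1)
    return classes[dists.argmin(axis=1)]
```

With `base_mean` set to the base-class feature mean, this reproduces the paper's strongest reported variant (CL2N); dropping the subtraction gives plain L2-normalization, and dropping both recovers the unnormalized nearest-neighbor baseline.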
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Few-shot classification | tieredImageNet (test) | -- | -- | 282 |
| Few-shot Image Classification | Mini-Imagenet (test) | Accuracy | 81.5 | 235 |
| Class-incremental learning | CIFAR-100 | Averaged Incremental Accuracy | 43.8 | 234 |
| 5-way Classification | miniImageNet (test) | Accuracy | 80.43 | 231 |
| Image Classification | MiniImagenet | Accuracy | 66.92 | 206 |
| Few-shot classification | Mini-ImageNet | 1-shot Acc | 63.5 | 175 |
| 5-way Few-shot Classification | MiniImagenet | Accuracy (5-shot) | 82.09 | 150 |
| Few-shot classification | CUB (test) | Accuracy | 81.3 | 145 |
| Few-shot classification | miniImageNet standard (test) | -- | -- | 138 |
| Few-shot classification | miniImageNet (test) | Accuracy | 68.1 | 120 |
Showing 10 of 78 rows.