
Self-Supervised Prototypical Transfer Learning for Few-Shot Classification

About

Most approaches in few-shot learning rely on costly annotated data related to the goal task domain during (pre-)training. Recently, unsupervised meta-learning methods have exchanged the annotation requirement for a reduction in few-shot classification performance. Simultaneously, in settings with realistic domain shift, common transfer learning has been shown to outperform supervised meta-learning. Building on these insights and on advances in self-supervised learning, we propose a transfer learning approach which constructs a metric embedding that clusters unlabeled prototypical samples and their augmentations closely together. This pre-trained embedding is a starting point for few-shot classification by summarizing class clusters and fine-tuning. We demonstrate that our self-supervised prototypical transfer learning approach ProtoTransfer outperforms state-of-the-art unsupervised meta-learning methods on few-shot tasks from the mini-ImageNet dataset. In few-shot experiments with domain shift, our approach even has comparable performance to supervised methods, but requires orders of magnitude fewer labels.
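The core idea of the pre-training stage can be illustrated with a short sketch. The snippet below is a minimal, hypothetical PyTorch rendition of a self-supervised prototypical loss of the kind described in the abstract: each unlabeled sample acts as its own prototype, and its augmented views are pulled toward that prototype in the metric embedding space. All names (encoder, augment, batch shapes) are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def proto_pretrain_loss(encoder, x, augment, n_views=3):
    """Self-supervised prototypical loss sketch.

    x: batch of unlabeled images, shape (B, C, H, W).
    encoder: maps images to D-dimensional embeddings.
    augment: stochastic augmentation applied to a batch.
    """
    B = x.size(0)
    prototypes = encoder(x)                        # (B, D): one prototype per unlabeled sample
    views = torch.cat([augment(x) for _ in range(n_views)], dim=0)
    queries = encoder(views)                       # (B * n_views, D)

    # Squared Euclidean distances between every augmented view and every prototype.
    dists = torch.cdist(queries, prototypes) ** 2  # (B * n_views, B)

    # Each view's target is the prototype of the sample it was augmented from,
    # so a softmax over negative distances clusters views around their source.
    targets = torch.arange(B, device=x.device).repeat(n_views)
    return F.cross_entropy(-dists, targets)
```

At few-shot time, the same embedding would be reused: class prototypes are summarized as the mean embedding of the labeled support examples (the standard prototypical-network formulation), optionally followed by fine-tuning.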

Carlos Medina, Arnout Devos, Matthias Grossglauser • 2020

Related benchmarks

Task                                 | Dataset                    | Result                          | Rank
Few-shot Image Classification       | Mini-Imagenet (test)       | --                              | 235
Image Classification                | MiniImagenet               | Accuracy: 63.01                 | 206
Few-shot classification             | Mini-Imagenet (test)       | --                              | 113
Few-shot Image Classification       | miniImageNet (test)        | --                              | 111
Few-shot classification             | Omniglot (test)            | Accuracy: 99.08                 | 109
Few-shot classification             | MiniImagenet               | 5-way 5-shot Accuracy: 62.99    | 98
5-way Classification                | EuroSAT                    | --                              | 51
Few-shot Image Classification       | CropDiseases CDFSL (test)  | --                              | 45
Cross-domain few-shot classification| CD-FSL benchmark           | --                              | 33
N-way K-shot classification         | Mini-Imagenet (test)       | Accuracy (5-way, 1-shot): 45.67 | 26

Showing 10 of 16 rows.

Other info

Code
