
Enhancing Few-Shot Image Classification with Unlabelled Examples

About

We develop a transductive meta-learning method that uses unlabelled instances to improve few-shot image classification performance. Our approach combines a regularized Mahalanobis-distance-based soft k-means clustering procedure with a modified state-of-the-art neural adaptive feature extractor to achieve improved test-time classification accuracy using unlabelled data. We evaluate our method on transductive few-shot learning tasks, in which the goal is to jointly predict labels for query (test) examples given a set of support (training) examples. We achieve state-of-the-art performance on the Meta-Dataset, mini-ImageNet, and tiered-ImageNet benchmarks. All trained models and code have been made publicly available at github.com/plai-group/simple-cnaps.
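To make the clustering idea above concrete, here is a minimal illustrative sketch of a transductive soft k-means step with a regularized Mahalanobis metric: class means are initialised from the labelled support set, a shared covariance is estimated from within-class support scatter and blended with the identity, queries receive soft class assignments, and the means are refined with those soft assignments. The function name, the `reg` hyperparameter, and the exact covariance estimate are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def soft_kmeans_mahalanobis(support, support_labels, query, n_iters=3, reg=1.0):
    """Illustrative transductive soft k-means with a regularized
    Mahalanobis distance (a hypothetical sketch, not the paper's code).

    support: (n_s, d) labelled support embeddings
    support_labels: (n_s,) integer class labels in [0, k)
    query: (n_q, d) unlabelled query embeddings
    Returns (n_q, k) soft class responsibilities for the queries.
    """
    k, d = int(support_labels.max()) + 1, support.shape[1]
    # Initialise class means from the labelled support examples.
    means = np.stack([support[support_labels == c].mean(0) for c in range(k)])
    for _ in range(n_iters):
        # Regularized metric: within-class scatter blended with the identity
        # for numerical stability (reg is an assumed hyperparameter).
        centered = support - means[support_labels]
        cov = centered.T @ centered / max(len(support) - 1, 1) + reg * np.eye(d)
        prec = np.linalg.inv(cov)
        # E-step: soft responsibilities from squared Mahalanobis distances.
        diff = query[:, None, :] - means[None, :, :]        # (n_q, k, d)
        d2 = np.einsum('qkd,de,qke->qk', diff, prec, diff)  # (n_q, k)
        logits = -0.5 * d2
        resp = np.exp(logits - logits.max(1, keepdims=True))
        resp /= resp.sum(1, keepdims=True)
        # M-step: refine means using labelled support plus soft-assigned queries.
        for c in range(k):
            w = (support_labels == c).astype(float)
            means[c] = (w @ support + resp[:, c] @ query) / (w.sum() + resp[:, c].sum())
    return resp
```

The key design point is that the unlabelled query points participate in estimating the class means, which is what makes the procedure transductive rather than purely inductive.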

Peyman Bateni, Jarred Barber, Jan-Willem van de Meent, Frank Wood • 2020

Related benchmarks

Task | Dataset | Result | Rank
Few-shot classification | tieredImageNet (test) | -- | 282
Few-shot Image Classification | Mini-Imagenet (test) | -- | 235
Image Classification | MiniImagenet | Accuracy 73.1 | 206
Few-shot classification | Mini-ImageNet | 1-shot Acc 79.9 | 175
Few-shot Image Classification | miniImageNet (test) | -- | 111
Few-shot classification | MiniImagenet | 5-way 5-shot Accuracy 73.1 | 98
Few-shot Image Classification | tieredImageNet | -- | 90
Few-shot classification | Mini-Imagenet 5-way 5-shot | Accuracy 91.5 | 87
Few-shot Image Classification | tieredImageNet (test) | -- | 86
Few-shot classification | Meta-Dataset | Avg Seen Accuracy 75.1 | 45

Showing 10 of 12 rows.
