
Low-shot learning with large-scale diffusion

About

This paper considers the problem of inferring image labels when only a few annotated examples are available at training time. This setup is often referred to as low-shot learning, where a standard approach is to re-train the last few layers of a convolutional neural network learned on separate classes for which training examples are abundant. We consider a semi-supervised setting that leverages a large collection of unlabeled images to support label propagation, made possible by recent advances in large-scale similarity graph construction. We show that, despite its conceptual simplicity, scaling label propagation up to hundreds of millions of images leads to state-of-the-art accuracy in the low-shot learning regime.
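The core idea, propagating labels from a few seed images to unlabeled ones over a similarity graph, can be sketched in a few lines. This is a minimal dense-matrix illustration of generic graph-based label propagation, not the paper's implementation (which operates on approximate kNN graphs at a much larger scale); the function name and parameters here are illustrative.

```python
import numpy as np

def label_propagation(W, Y0, labeled_mask, alpha=0.75, n_iter=20):
    """Diffuse class scores over a similarity graph (illustrative sketch).

    W: (n, n) nonnegative similarity matrix (e.g. a kNN graph).
    Y0: (n, c) one-hot seed labels; rows for unlabeled nodes are zero.
    labeled_mask: (n,) bool, True where the label is known.
    """
    # Row-normalize so each diffusion step averages neighbor scores.
    P = W / np.clip(W.sum(axis=1, keepdims=True), 1e-12, None)
    Y = Y0.copy()
    for _ in range(n_iter):
        # Mix diffused scores with the seed labels, then clamp seeds.
        Y = alpha * (P @ Y) + (1 - alpha) * Y0
        Y[labeled_mask] = Y0[labeled_mask]
    return Y.argmax(axis=1)
```

At the scale discussed in the paper, the dense matrix `W` would be replaced by a sparse kNN graph built with an approximate nearest-neighbor index, and the diffusion step becomes a sparse matrix-vector product per class.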

Matthijs Douze, Arthur Szlam, Bharath Hariharan, Hervé Jégou • 2017

Related benchmarks

Task                           Dataset                        Result                           Rank
Generalized Few-Shot Learning  ImageNet 2012 (Novel classes)  Top-5 Accuracy: 57.6             70
Few-shot Image Classification  ImageNet FS (novel)            Top-5 Acc: 0.776                 59
Low-shot Image Classification  ImageNet-1k (val)              Top-5 Accuracy: 86.3             40
Image Classification           ImageNet FS                    Top-5 Acc (Novel, 1-shot): 57.7  13
Image Classification           ImageNet in-domain             --                               5

Other info

Code
