
Meta-Learning with Latent Embedding Optimization

About

Gradient-based meta-learning techniques are both widely applicable and proficient at solving challenging few-shot learning and fast adaptation problems. However, they have practical difficulties when operating on high-dimensional parameter spaces in extreme low-data regimes. We show that it is possible to bypass these limitations by learning a data-dependent latent generative representation of model parameters, and performing gradient-based meta-learning in this low-dimensional latent space. The resulting approach, latent embedding optimization (LEO), decouples the gradient-based adaptation procedure from the underlying high-dimensional space of model parameters. Our evaluation shows that LEO can achieve state-of-the-art performance on the competitive miniImageNet and tieredImageNet few-shot classification tasks. Further analysis indicates LEO is able to capture uncertainty in the data, and can perform adaptation more effectively by optimizing in latent space.
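The core idea described above — adapting a low-dimensional latent code by gradient descent and decoding it into high-dimensional model weights — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the linear decoder, the toy shapes, the random support set, and the plain softmax classifier are all simplifying assumptions (LEO uses a learned, data-dependent encoder and a generative decoder over network parameters).

```python
import numpy as np

# Hedged sketch of LEO's inner loop: instead of taking gradient steps on the
# high-dimensional classifier weights W directly, we take them on a small
# latent code z and decode z -> W. Decoder and data here are toy assumptions.

rng = np.random.default_rng(0)

n_latent, n_features, n_classes = 4, 64, 5   # z is far smaller than W

# Illustrative decoder: a fixed linear map from latent z to flattened weights W.
decoder = rng.normal(scale=0.1, size=(n_features * n_classes, n_latent))

def decode(z):
    return (decoder @ z).reshape(n_features, n_classes)

def loss_and_grad(z, x, y):
    """Softmax cross-entropy of a linear classifier with weights decode(z),
    plus its gradient w.r.t. z (chain rule through the linear decoder)."""
    W = decode(z)
    logits = x @ W
    logits -= logits.max(axis=1, keepdims=True)
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    n = x.shape[0]
    loss = -np.log(p[np.arange(n), y] + 1e-12).mean()
    dlogits = p.copy()
    dlogits[np.arange(n), y] -= 1.0
    dlogits /= n
    dW = x.T @ dlogits                    # gradient w.r.t. decoded weights
    dz = decoder.T @ dW.reshape(-1)       # chain rule back to the latent code
    return loss, dz

# Toy "support set" for one few-shot task (random data, for illustration only).
x = rng.normal(size=(25, n_features))
y = rng.integers(0, n_classes, size=25)

z = rng.normal(size=n_latent)   # in LEO this initial code comes from an encoder
losses = []
for _ in range(20):             # inner-loop adaptation happens in latent space
    loss, dz = loss_and_grad(z, x, y)
    losses.append(loss)
    z -= 1.0 * dz               # gradient step on z, never on W directly

print(losses[0], "->", losses[-1])
```

Because the loss is convex in the decoded weights and the decoder here is linear, these latent-space steps monotonically reduce the support-set loss while only ever touching a 4-dimensional variable instead of the 320-dimensional weight matrix.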

Andrei A. Rusu, Dushyant Rao, Jakub Sygnowski, Oriol Vinyals, Razvan Pascanu, Simon Osindero, Raia Hadsell • 2018

Related benchmarks

Task | Dataset | Metric | Result | Rank
Few-shot classification | tieredImageNet (test) | Accuracy | 81.44 | 282
Few-shot Image Classification | Mini-Imagenet (test) | Accuracy | 79.49 | 235
5-way Classification | miniImageNet (test) | Accuracy | 77.6 | 231
Few-shot classification | Mini-ImageNet | 1-shot Acc | 61.8 | 175
5-way Few-shot Classification | MiniImagenet | Accuracy (5-shot) | 77.59 | 150
5-way Few-shot Classification | Mini-Imagenet (test) | 1-shot Accuracy | 61.76 | 141
Few-shot classification | miniImageNet standard (test) | 5-way 1-shot Acc | 63.97 | 138
Few-shot classification | miniImageNet (test) | Accuracy | 77.59 | 120
5-way Image Classification | tieredImageNet 5-way (test) | 1-shot Acc | 66.33 | 117
Few-shot Image Classification | miniImageNet (test) | -- | -- | 111
Showing 10 of 36 rows
