Meta-Learning with Latent Embedding Optimization
About
Gradient-based meta-learning techniques are both widely applicable and proficient at solving challenging few-shot learning and fast adaptation problems. However, they have practical difficulties when operating on high-dimensional parameter spaces in extreme low-data regimes. We show that it is possible to bypass these limitations by learning a data-dependent latent generative representation of model parameters, and performing gradient-based meta-learning in this low-dimensional latent space. The resulting approach, latent embedding optimization (LEO), decouples the gradient-based adaptation procedure from the underlying high-dimensional space of model parameters. Our evaluation shows that LEO can achieve state-of-the-art performance on the competitive miniImageNet and tieredImageNet few-shot classification tasks. Further analysis indicates LEO is able to capture uncertainty in the data, and can perform adaptation more effectively by optimizing in latent space.
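The core idea above — adapt a low-dimensional latent code with gradient steps, and decode it into high-dimensional model parameters — can be illustrated with a toy sketch. This is a minimal illustration, not the paper's implementation: the decoder here is a fixed random linear map and the task is toy regression, whereas LEO meta-learns a data-dependent encoder/decoder and applies this to few-shot classification. All names (`decode`, `loss_and_grad_z`, the dimensions) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions: the latent space is much smaller than the parameter space,
# which is the point of adapting in latent space.
latent_dim, param_dim, n_support = 4, 64, 20

# Hypothetical fixed linear decoder (in LEO this is meta-learned).
W_dec = rng.normal(0.0, 0.1, size=(param_dim, latent_dim))

def decode(z):
    """Map a low-dimensional latent code to high-dimensional parameters."""
    return W_dec @ z

# Toy support set for one task: targets from a linear model.
x = rng.normal(size=(n_support, param_dim))
w_true = rng.normal(size=param_dim)
y = x @ w_true

def loss_and_grad_z(z):
    """Support-set squared error and its gradient w.r.t. the latent code z."""
    w = decode(z)
    residual = x @ w - y                     # (n_support,)
    loss = 0.5 * np.mean(residual ** 2)
    grad_w = x.T @ residual / n_support      # dL/dw
    grad_z = W_dec.T @ grad_w                # chain rule through the decoder
    return loss, grad_z

# Inner-loop adaptation: a few gradient steps in the latent space only;
# the param_dim-dimensional weights are never optimized directly.
z = np.zeros(latent_dim)
alpha = 0.1  # inner-loop learning rate (hypothetical value)
losses = []
for _ in range(10):
    loss, g = loss_and_grad_z(z)
    losses.append(loss)
    z = z - alpha * g

final_loss, _ = loss_and_grad_z(z)
print(f"support loss before adaptation: {losses[0]:.3f}, after: {final_loss:.3f}")
```

Because the loss is quadratic in `z` and the step size is small, each inner step reduces the support loss; in LEO the outer (meta) loop additionally trains the encoder/decoder so that a few such latent steps suffice for a new task.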
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Few-shot classification | tieredImageNet (test) | Accuracy: 81.44 | 282 |
| Few-shot Image Classification | Mini-Imagenet (test) | Accuracy: 79.49 | 235 |
| 5-way Classification | miniImageNet (test) | Accuracy: 77.6 | 231 |
| Few-shot classification | Mini-ImageNet | 1-shot Acc: 61.8 | 175 |
| 5-way Few-shot Classification | MiniImagenet | Accuracy (5-shot): 77.59 | 150 |
| 5-way Few-shot Classification | Mini-Imagenet (test) | 1-shot Accuracy: 61.76 | 141 |
| Few-shot classification | miniImageNet standard (test) | 5-way 1-shot Acc: 63.97 | 138 |
| Few-shot classification | miniImageNet (test) | Accuracy: 77.59 | 120 |
| 5-way Image Classification | tieredImageNet 5-way (test) | 1-shot Acc: 66.33 | 117 |
| Few-shot Image Classification | miniImageNet (test) | -- | 111 |