How to train your MAML

About

The field of few-shot learning has recently seen substantial advancements, most of which came from casting few-shot learning as a meta-learning problem. Model Agnostic Meta Learning (MAML) is currently one of the best approaches for few-shot learning via meta-learning. MAML is simple, elegant, and very powerful; however, it has several issues: it is very sensitive to neural network architectures, often unstable during training, requires arduous hyperparameter searches to stabilize training and achieve high generalization, and is very computationally expensive at both training and inference time. In this paper, we propose various modifications to MAML that not only stabilize the system, but also substantially improve its generalization performance, convergence speed, and computational overhead. We call the resulting method MAML++.
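To make the meta-learning setup concrete: MAML learns an initialization that adapts to a new task with a few gradient steps on that task's support set, and is meta-updated using the loss on the task's query set. The sketch below is a minimal first-order (FOMAML-style) illustration on toy 1-D linear regression tasks, not the paper's implementation; the model, step sizes, and tasks are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_and_grad(w, x, y):
    # MSE loss 0.5 * mean((w*x - y)^2) and its gradient w.r.t. the scalar w
    err = w * x - y
    return 0.5 * np.mean(err ** 2), np.mean(err * x)

def maml_step(w, tasks, alpha=0.05, beta=0.01):
    """One first-order MAML outer update over a batch of tasks.

    Each task is (x_support, y_support, x_query, y_query). The inner loop
    adapts w on the support set; the outer gradient is taken at the
    adapted parameters on the query set (second-order terms dropped).
    """
    outer_grad = 0.0
    for xs, ys, xq, yq in tasks:
        _, g_support = loss_and_grad(w, xs, ys)
        w_adapted = w - alpha * g_support          # inner-loop adaptation
        _, g_query = loss_and_grad(w_adapted, xq, yq)
        outer_grad += g_query                      # first-order meta-gradient
    return w - beta * outer_grad / len(tasks)

def make_task(slope, n=10):
    # Each toy task is regression onto y = slope * x for a different slope
    x = rng.standard_normal(2 * n)
    y = slope * x
    return x[:n], y[:n], x[n:], y[n:]

w = 0.0
slopes = [1.5, 2.0, 2.5]
for _ in range(500):
    w = maml_step(w, [make_task(s) for s in slopes])
```

After meta-training, `w` sits near the centre of the task family (around 2.0 here), so a single inner-loop step on a new task's few support examples already yields a good fit, which is the point of learning the initialization rather than a fixed model.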

Antreas Antoniou, Harrison Edwards, Amos Storkey • 2018

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Few-shot Image Classification | Mini-Imagenet (test) | -- | 235 |
| 5-way Few-shot Classification | MiniImagenet | Accuracy (5-shot): 68.32 | 150 |
| Few-shot classification | miniImageNet standard (test) | 5-way 1-shot Acc: 52.15 | 138 |
| Few-shot classification | Omniglot (test) | -- | 109 |
| 5-way Few-shot Classification | miniImageNet standard (test) | Accuracy: 68.32 | 91 |
| Few-shot classification | Mini-Imagenet 5-way 5-shot | Accuracy: 68.32 | 87 |
| Few-shot classification | Mini-ImageNet 1-shot 5-way (test) | Accuracy: 52.15 | 82 |
| 5-way 5-shot Classification | Omniglot (test) | Accuracy: 99.93 | 49 |
| Few-shot classification | Omniglot 20-way 1-shot (test) | Accuracy: 97.65 | 43 |
| Few-shot classification | Omniglot 20-way 5-shot (test) | Accuracy: 99.33 | 43 |

Showing 10 of 19 rows.
