
Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace

About

Gradient-based meta-learning methods leverage gradient descent to learn the commonalities among various tasks. While previous such methods have been successful in meta-learning tasks, they resort to simple gradient descent during meta-testing. Our primary contribution is the *MT-net*, which enables the meta-learner to learn on each layer's activation space a subspace that the task-specific learner performs gradient descent on. Additionally, a task-specific learner of an *MT-net* performs gradient descent with respect to a meta-learned distance metric, which warps the activation space to be more sensitive to task identity. We demonstrate that the dimension of this learned subspace reflects the complexity of the task-specific learner's adaptation task, and also that our model is less sensitive to the choice of initial learning rates than previous gradient-based meta-learning methods. Our method achieves state-of-the-art or comparable performance on few-shot classification and regression tasks.
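The inner-loop update the abstract describes can be sketched for a single linear layer. This is a minimal illustrative sketch, not the authors' implementation: the names `W`, `T`, and `M` stand for the task-specific weights, the meta-learned metric ("warp") matrix, and the meta-learned binary mask selecting the subspace of rows that adaptation may change.

```python
import numpy as np

# Hedged sketch of an MT-net-style task-specific update for one layer.
# Forward pass: h = T @ (W @ x), where
#   W : task-specific weights, adapted by gradient descent at meta-test time
#   T : meta-learned metric matrix, frozen during adaptation (warps activations)
#   M : meta-learned 0/1 mask; only rows of W with M = 1 (the subspace) adapt

rng = np.random.default_rng(0)
d_in, d_out = 4, 3

W = rng.normal(size=(d_out, d_in))                          # adapted per task
T = np.eye(d_out) + 0.1 * rng.normal(size=(d_out, d_out))   # meta-learned
M = np.array([1.0, 1.0, 0.0])                               # last row frozen

def forward(W, x):
    return T @ (W @ x)

def inner_step(W, x, y, lr=0.1):
    """One gradient step on 0.5 * ||T W x - y||^2, restricted to the subspace."""
    err = forward(W, x) - y                        # (d_out,)
    grad_W = (T.T @ err)[:, None] * x[None, :]     # full gradient w.r.t. W
    return W - lr * (M[:, None] * grad_W)          # masked update

x = rng.normal(size=d_in)
y = rng.normal(size=d_out)
W_adapted = inner_step(W, x, y)

print(np.allclose(W_adapted[2], W[2]))      # masked-out row is unchanged
print(not np.allclose(W_adapted[0], W[0]))  # in-subspace row is updated
```

In the full method, `T` and `M` are shared across tasks and trained in the outer meta-learning loop, while only the masked portion of `W` moves during per-task adaptation; this sketch shows just the per-task step.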

Yoonho Lee, Seungjin Choi • 2018

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Image Classification | Aircraft | Accuracy: 63.03 | 302 |
| 5-way Few-shot Classification | miniImageNet standard (test) | Accuracy: 51.7 | 91 |
| Few-shot Image Classification | tieredImageNet (test) | Accuracy: 51.95 | 86 |
| 5-way Image Classification | MiniImagenet | One-shot Accuracy: 51.7 | 67 |
| Image Classification | miniImageNet standard (test) | Accuracy: 49.75 | 61 |
| Image Classification | Bird | Accuracy: 69.22 | 29 |
| Image Classification | Fungi | Accuracy: 53.49 | 18 |
| Classification | Texture | Accuracy: 46.57 | 17 |
| Toy Regression | Toy Regression 5-shot (test) | MSE: 2.435 | 6 |
| Toy Regression | Toy Regression 10-shot (test) | MSE: 0.967 | 6 |
