
Unsupervised Learning via Meta-Learning

About

A central goal of unsupervised learning is to acquire representations from unlabeled data or experience that can be used for more effective learning of downstream tasks from modest amounts of labeled data. Many prior unsupervised learning works aim to do so by developing proxy objectives based on reconstruction, disentanglement, prediction, and other metrics. Instead, we develop an unsupervised meta-learning method that explicitly optimizes for the ability to learn a variety of tasks from small amounts of data. To do so, we construct tasks from unlabeled data in an automatic way and run meta-learning over the constructed tasks. Surprisingly, we find that, when integrated with meta-learning, relatively simple task construction mechanisms, such as clustering embeddings, lead to good performance on a variety of downstream, human-specified tasks. Our experiments across four image datasets indicate that our unsupervised meta-learning approach acquires a learning algorithm without any labeled data that is applicable to a wide range of downstream classification tasks, improving upon the embedding learned by four prior unsupervised learning methods.
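The task-construction mechanism described above can be sketched in a few lines: cluster unlabeled embeddings into pseudo-classes, then sample N-way, K-shot tasks by treating clusters as classes. This is an illustrative sketch under simplified assumptions, not the paper's implementation (which uses learned embeddings such as those from ACAI or DeepCluster and feeds the constructed tasks to a meta-learner like MAML); the function names and parameters here are hypothetical.

```python
import numpy as np

def kmeans(embeddings, k, n_iters=20):
    """Toy k-means over precomputed embeddings, producing pseudo-labels.
    Uses greedy farthest-point initialization for determinism."""
    centers = [embeddings[0]]
    for _ in range(1, k):
        # Next center: the point farthest from all chosen centers.
        d = np.min(
            np.linalg.norm(embeddings[:, None] - np.array(centers)[None], axis=-1),
            axis=1)
        centers.append(embeddings[d.argmax()])
    centers = np.array(centers, dtype=float)
    for _ in range(n_iters):
        # Assign each point to its nearest center, then recompute centers.
        dists = np.linalg.norm(embeddings[:, None] - centers[None], axis=-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = embeddings[labels == j]
            if len(members):  # keep the old center if a cluster emptied
                centers[j] = members.mean(axis=0)
    return labels

def sample_task(embeddings, pseudo_labels, n_way=2, k_shot=1, k_query=1, seed=0):
    """Build one N-way, K-shot task: clusters act as classes, and each
    returns (index, task_label) pairs for support and query sets."""
    rng = np.random.default_rng(seed)
    # Only clusters with enough members can serve as task classes.
    valid = [c for c in np.unique(pseudo_labels)
             if np.sum(pseudo_labels == c) >= k_shot + k_query]
    classes = rng.choice(valid, size=n_way, replace=False)
    support, query = [], []
    for new_label, c in enumerate(classes):
        idx = rng.permutation(np.flatnonzero(pseudo_labels == c))
        support += [(i, new_label) for i in idx[:k_shot]]
        query += [(i, new_label) for i in idx[k_shot:k_shot + k_query]]
    return support, query
```

Tasks constructed this way are noisy relative to human-specified ones, but the abstract's point is that meta-learning over many such tasks still yields a learner that transfers to downstream classification.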

Kyle Hsu, Sergey Levine, Chelsea Finn • 2018

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Few-shot Image Classification | Mini-Imagenet (test) | - | 235 |
| 5-way Classification | miniImageNet (test) | Accuracy: 75.54 | 231 |
| Image Classification | MiniImagenet | Accuracy: 53.97 | 206 |
| Few-shot Classification | Mini-ImageNet | - | 175 |
| 5-way Few-shot Classification | Mini-Imagenet (test) | - | 141 |
| Few-shot Classification | Mini-Imagenet (test) | - | 113 |
| Few-shot Image Classification | miniImageNet (test) | Accuracy: 69.64 | 111 |
| Few-shot Classification | Omniglot (test) | Accuracy: 87.78 | 109 |
| Few-shot Image Classification | tieredImageNet | Accuracy: 79.16 | 90 |
| Facial Attribute Classification | CelebA (test) | Average Accuracy: 74.98 | 89 |
Showing 10 of 40 rows
