
Data Distillation: Towards Omni-Supervised Learning

About

We investigate omni-supervised learning, a special regime of semi-supervised learning in which the learner exploits all available labeled data plus internet-scale sources of unlabeled data. Omni-supervised learning is lower-bounded by performance on existing labeled datasets, offering the potential to surpass state-of-the-art fully supervised methods. To exploit the omni-supervised setting, we propose data distillation, a method that ensembles predictions from multiple transformations of unlabeled data, using a single model, to automatically generate new training annotations. We argue that visual recognition models have recently become accurate enough that it is now possible to apply classic ideas about self-training to challenging real-world data. Our experimental results show that in the cases of human keypoint detection and general object detection, state-of-the-art models trained with data distillation surpass the performance of using labeled data from the COCO dataset alone.

Ilija Radosavovic, Piotr Dollár, Ross Girshick, Georgia Gkioxari, Kaiming He • 2017
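
As a rough illustration of the multi-transform ensembling described above, the sketch below runs a detector over flipped and rescaled copies of an unlabeled image, maps the predictions back to the original coordinates, and keeps only confident detections as automatically generated annotations. This is a minimal sketch under stated assumptions: the `model` callable, the transform set, the nearest-neighbor resize, and the score threshold are illustrative placeholders, not the paper's exact pipeline.

```python
# Sketch of data distillation on a single unlabeled image.
# Assumption: `model` is any single-image detector that returns
# (boxes, scores), with boxes as an (N, 4) array of (x1, y1, x2, y2)
# pixel coordinates in the coordinates of the image it was given.
import numpy as np

def hflip_boxes(boxes, width):
    """Map boxes predicted on a horizontally flipped image back to the original."""
    x1, y1, x2, y2 = boxes.T
    return np.stack([width - x2, y1, width - x1, y2], axis=1)

def distill_annotations(model, image, scales=(0.5, 1.0, 2.0), score_thresh=0.9):
    """Ensemble predictions over flips/scales of one unlabeled image and
    keep the confident ones as new training annotations (pseudo-labels)."""
    h, w = image.shape[:2]
    all_boxes, all_scores = [], []
    for s in scales:
        # Nearest-neighbor resize via index arrays (keeps the sketch numpy-only).
        ys = (np.arange(int(h * s)) / s).astype(int).clip(0, h - 1)
        xs = (np.arange(int(w * s)) / s).astype(int).clip(0, w - 1)
        scaled = image[ys][:, xs]
        for flip in (False, True):
            inp = scaled[:, ::-1] if flip else scaled
            boxes, scores = model(inp)   # predictions in `inp` coordinates
            if flip:
                boxes = hflip_boxes(boxes, inp.shape[1])
            boxes = boxes / s            # undo the scaling back to original resolution
            all_boxes.append(boxes)
            all_scores.append(scores)
    boxes = np.concatenate(all_boxes)
    scores = np.concatenate(all_scores)
    # Illustrative selection rule: keep only high-confidence predictions from the
    # transform ensemble as generated annotations for retraining.
    keep = scores >= score_thresh
    return boxes[keep], scores[keep]
```

The generated annotations would then be mixed with the original labeled data to retrain the model; the specific transforms, merging rule, and confidence threshold above are assumptions for exposition rather than the settings used in the paper.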

Related benchmarks

Task | Dataset | Metric | Result | Rank
Object Detection | PASCAL VOC 2007 (test) | - | - | 821
2D Human Pose Estimation | COCO 2017 (val) | AP | 56.6 | 386
Object Detection | COCO (minival) | mAP | 37.9 | 184
Object Detection | MS-COCO 2014 (minival) | mAP | 33.1 | 23
Instance Segmentation | COCO 2017 (val, random split) | Mask AP | 0.242 | 12
Instance Segmentation | COCO 1% labels (val) | AP | 3.8 | 7
Instance Segmentation | COCO 2% labels (val) | AP | 11.8 | 7
Instance Segmentation | COCO 5% labels (val) | AP | 20.4 | 7
Instance Segmentation | COCO 10% labels (val) | AP | 24.2 | 7
Hand Pose Estimation | FPHA (test) | Hand AUC (Joint) | 74.9 | 3
