
Cross Modal Distillation for Supervision Transfer

About

In this work we propose a technique that transfers supervision between images from different modalities. We use learned representations from a large labeled modality as a supervisory signal for training representations for a new unlabeled paired modality. Our method enables learning of rich representations for unlabeled modalities and can be used as a pre-training procedure for new modalities with limited labeled data. We show experimental results where we transfer supervision from labeled RGB images to unlabeled depth and optical flow images and demonstrate large improvements for both these cross modal supervision transfers. Code, data and pre-trained models are available at https://github.com/s-gupta/fast-rcnn/tree/distillation
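At a high level, the transfer works by freezing a network trained on the labeled modality (e.g. RGB) and training a network on the paired, unlabeled modality (e.g. depth) to reproduce its intermediate representations, so no labels for the new modality are needed. Below is a minimal illustrative sketch in PyTorch; the released code linked above uses Caffe, and the toy network definitions, layer sizes, and L2 feature-matching loss here are stand-ins, not the authors' exact architecture or training setup.

```python
import torch
import torch.nn as nn

# Hypothetical feature extractors; stand-ins for real conv towers (e.g. AlexNet).
rgb_net = nn.Sequential(nn.Conv2d(3, 64, 7, stride=2), nn.ReLU(),
                        nn.Conv2d(64, 256, 3, padding=1))   # pretrained on labeled RGB, frozen
depth_net = nn.Sequential(nn.Conv2d(3, 64, 7, stride=2), nn.ReLU(),
                          nn.Conv2d(64, 256, 3, padding=1)) # trained from scratch

for p in rgb_net.parameters():
    p.requires_grad = False  # the labeled-modality network only supplies targets

optimizer = torch.optim.SGD(depth_net.parameters(), lr=1e-3, momentum=0.9)
loss_fn = nn.MSELoss()  # L2 matching of paired mid-level features

def supervision_transfer_step(rgb_img, depth_img):
    """One step: make depth features mimic frozen RGB features on a paired image."""
    with torch.no_grad():
        target = rgb_net(rgb_img)      # supervisory signal; no depth labels needed
    pred = depth_net(depth_img)
    loss = loss_fn(pred, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Paired, unlabeled RGB-D images (random tensors as stand-ins for a real dataset).
rgb = torch.randn(4, 3, 224, 224)
depth = torch.randn(4, 3, 224, 224)  # depth encoded as 3 channels (e.g. HHA)
print(supervision_transfer_step(rgb, depth))
```

The distilled depth network can then serve as initialization for fine-tuning on whatever limited labeled depth data is available, which is the pre-training use case the abstract describes.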

Saurabh Gupta, Judy Hoffman, Jitendra Malik • 2015

Related benchmarks

Task             | Dataset        | Result | Rank
Object Detection | NYUD v2 (test) | -      | 24
