
Simultaneous Deep Transfer Across Domains and Tasks

About

Recent reports suggest that a generic supervised deep CNN model trained on a large-scale dataset reduces, but does not remove, dataset bias. Fine-tuning deep models in a new domain can require a significant amount of labeled data, which for many applications is simply not available. We propose a new CNN architecture to exploit unlabeled and sparsely labeled target domain data. Our approach simultaneously optimizes for domain invariance to facilitate domain transfer and uses a soft label distribution matching loss to transfer information between tasks. Our proposed adaptation method offers empirical performance which exceeds previously published results on two standard benchmark visual domain adaptation tasks, evaluated across supervised and semi-supervised adaptation settings.

Eric Tzeng, Judy Hoffman, Trevor Darrell, Kate Saenko • 2015
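To make the two objectives in the abstract concrete, below is a minimal PyTorch sketch of how a domain-confusion loss and a soft label distribution matching loss could be combined with the usual source classification loss. The layer sizes, temperature T, loss weights, and stand-in data are illustrative assumptions, not the paper's exact architecture or hyperparameters.

# Sketch of the two losses described above: a domain-confusion term that pushes
# source and target features toward indistinguishability, and a soft label
# matching term that transfers category structure to target examples.
# All sizes, weights, and the temperature are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

num_classes = 31                                           # e.g. the Office-31 categories
features = nn.Sequential(nn.Linear(4096, 256), nn.ReLU())  # adaptation layer on top of CNN features
classifier = nn.Linear(256, num_classes)                   # category classifier
domain_head = nn.Linear(256, 2)                            # source-vs-target classifier

def domain_confusion_loss(domain_logits):
    # Cross-entropy against a uniform distribution over domains: minimized
    # when the domain classifier cannot tell source from target.
    return -F.log_softmax(domain_logits, dim=1).mean()

def soft_label_loss(target_logits, per_class_soft_labels, target_labels, T=2.0):
    # Match the temperature-softened target prediction to the average softened
    # source activation ("soft label") of the same category.
    log_p = F.log_softmax(target_logits / T, dim=1)
    soft_targets = per_class_soft_labels[target_labels]     # (batch, num_classes)
    return F.kl_div(log_p, soft_targets, reduction="batchmean")

# Illustrative combined objective on one batch (weights are assumptions):
xs, ys = torch.randn(8, 4096), torch.randint(0, num_classes, (8,))  # labeled source
xt, yt = torch.randn(4, 4096), torch.randint(0, num_classes, (4,))  # sparsely labeled target
soft_labels = F.softmax(torch.randn(num_classes, num_classes), dim=1)  # stand-in per-class soft labels

fs, ft = features(xs), features(xt)
loss = (F.cross_entropy(classifier(fs), ys)
        + 0.1 * domain_confusion_loss(domain_head(torch.cat([fs, ft])))
        + 0.1 * soft_label_loss(classifier(ft), soft_labels, yt))
loss.backward()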

Related benchmarks

Task | Dataset | Result | Rank
Image Classification | Office-31 | Average Accuracy: 82.21 | 261
Image Classification | SVHN → MNIST (test) | Accuracy: 68.1 | 66
Image Classification | MNIST → USPS (test) | Accuracy: 79.1 | 64
Image Classification | USPS → MNIST (test) | Accuracy: 66.5 | 63
Unsupervised Domain Adaptation Classification | Office-31 (test) | Accuracy (A→W): 89.8 | 51
Unsupervised Domain Adaptation | SVHN → MNIST (test) | Accuracy: 71.1 | 41
Movie Fill-in-the-Blank | LSMDC 2016 (test) | Accuracy: 33.2 | 34
Blond Hair Classification | CelebA (test) | Average Group Accuracy: 82.4 | 30
Unsupervised Domain Adaptation | MNIST → USPS (test) | Accuracy: 0.811 | 28
Unsupervised Domain Adaptation | SYN SIGNS → GTSRB (test) | Accuracy: 91.1 | 25

Showing 10 of 19 rows.
