
Multi-Source Domain Adaptation with Mixture of Experts

About

We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and different source domains. This relationship, expressed by a point-to-set metric, determines how to combine predictors trained on various domains. The metric is learned in an unsupervised fashion using meta-training. Experimental results on sentiment analysis and part-of-speech tagging demonstrate that our approach consistently outperforms multiple baselines and can robustly handle negative transfer.
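The combination scheme described above can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the point-to-set metric here is a simple minimum Euclidean distance (the paper learns its metric via meta-training), and the function names and temperature parameter are hypothetical.

```python
import numpy as np

def point_to_set_distance(x, source_set):
    """One simple choice of point-to-set metric: the minimum Euclidean
    distance from target example x to any example in the source set."""
    return min(np.linalg.norm(x - s) for s in source_set)

def moe_predict(x, source_sets, expert_predict_fns, temperature=1.0):
    """Combine per-source-domain experts into a single prediction.

    Each expert's class distribution is weighted by a softmax over
    negative point-to-set distances, so experts from source domains
    closer to x contribute more to the mixture."""
    dists = np.array([point_to_set_distance(x, S) for S in source_sets])
    logits = -dists / temperature
    weights = np.exp(logits - logits.max())   # numerically stable softmax
    weights /= weights.sum()
    probs = np.stack([f(x) for f in expert_predict_fns])  # (n_sources, n_classes)
    return weights @ probs  # mixture distribution over classes
```

Because distant source domains receive near-zero weight, a source that would cause negative transfer is effectively ignored rather than averaged in, which is the intuition behind the robustness claim.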

Jiang Guo, Darsh J. Shah, Regina Barzilay • 2018

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Natural Language Inference | XNLI (test) | Average Accuracy: 62.18 | 167 |
| Named Entity Recognition | WikiAnn (test) | Average Accuracy: 68.6 | 58 |
| Review Rating Classification | Amazon Reviews (en, es, fr) | Accuracy (de): 50.94 | 6 |
| Review Rating Classification | Amazon Reviews (en, ja, zh) | Accuracy (de): 49.69 | 6 |
