
Gromov-Wasserstein Alignment of Word Embedding Spaces

About

Cross-lingual or cross-domain correspondences play key roles in tasks ranging from machine translation to transfer learning. Recently, purely unsupervised methods operating on monolingual embeddings have become effective alignment tools. Current state-of-the-art methods, however, involve multiple steps, including heuristic post-hoc refinement strategies. In this paper, we cast the correspondence problem directly as an optimal transport (OT) problem, building on the idea that word embeddings arise from metric recovery algorithms. Indeed, we exploit the Gromov-Wasserstein distance that measures how similarities between pairs of words relate across languages. We show that our OT objective can be estimated efficiently, requires little or no tuning, and results in performance comparable with the state-of-the-art in various unsupervised word translation tasks.
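The core computation the abstract describes is an entropic Gromov-Wasserstein coupling between two embedding spaces, matching words by comparing intra-language similarity patterns rather than the vectors themselves. Below is a minimal NumPy sketch of the standard projected Sinkhorn scheme for entropic GW (following Peyré et al.'s square-loss decomposition); it is an illustrative reconstruction, not the authors' released code, and the function names and hyperparameters (`eps`, iteration counts) are assumptions.

```python
import numpy as np

def sinkhorn(K, p, q, n_iter=200):
    # Rescale kernel K so that diag(a) @ K @ diag(b) has marginals p and q.
    b = np.ones_like(q)
    for _ in range(n_iter):
        a = p / (K @ b)
        b = q / (K.T @ a)
    return a[:, None] * K * b[None, :]

def entropic_gw(C1, C2, p, q, eps=0.1, outer_iter=30):
    # C1 (n x n), C2 (m x m): intra-language similarity matrices.
    # p, q: word weights (e.g. uniform). Returns an n x m coupling G.
    G = np.outer(p, q)  # initialize with the independent coupling
    # Constant term of the square-loss tensor decomposition:
    # cst_ij = sum_k C1_ik^2 p_k + sum_l C2_jl^2 q_l
    cst = (C1**2 @ p)[:, None] + (C2**2 @ q)[None, :]
    for _ in range(outer_iter):
        L = cst - 2.0 * C1 @ G @ C2.T       # pseudo-cost for current G
        K = np.exp(-(L - L.min()) / eps)    # shift for numerical stability
        G = sinkhorn(K, p, q)               # project back to the marginals
    return G
```

Given the resulting coupling `G`, a source word `i` is translated to `np.argmax(G[i])`, the target word receiving the most transport mass.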

David Alvarez-Melis, Tommi S. Jaakkola • 2018

Related benchmarks

Task                           Dataset                  Result                          Rank
Multilingual Alignment         XLING standard (test)    Alignment Score (en-de): 66.7   8
Cross-lingual Word Alignment   MUSE                     Alignment Score (IT-EN): 80.38  7
