
A robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings

About

Recent work has managed to learn cross-lingual word embeddings without parallel data by mapping monolingual embeddings to a shared space through adversarial training. However, their evaluation has focused on favorable conditions, using comparable corpora or closely related languages, and we show that they often fail in more realistic scenarios. This work proposes an alternative approach based on a fully unsupervised initialization that explicitly exploits the structural similarity of the embeddings, and a robust self-learning algorithm that iteratively improves this solution. Our method succeeds in all tested scenarios and obtains the best published results on standard datasets, even surpassing previous supervised systems. Our implementation is released as an open source project at https://github.com/artetxem/vecmap.
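At its core, the self-learning algorithm alternates between fitting an orthogonal mapping on the current dictionary and re-inducing the dictionary from the mapped embeddings. The sketch below illustrates that loop on a noise-free toy problem; the function names and toy data are illustrative rather than taken from the released vecmap code, which additionally uses CSLS retrieval, stochastic dictionary induction, and the paper's fully unsupervised initialization (this toy starts from a small seed dictionary instead).

```python
# Illustrative sketch (not the released vecmap code) of self-learning:
# alternate an orthogonal Procrustes mapping with dictionary re-induction
# by nearest-neighbour retrieval.
import numpy as np

def orthogonal_map(X, Z, pairs):
    # Procrustes: the orthogonal W best aligning the current dictionary pairs.
    src, trg = pairs[:, 0], pairs[:, 1]
    U, _, Vt = np.linalg.svd(X[src].T @ Z[trg])
    return U @ Vt

def induce_dictionary(XW, Z):
    # Match every mapped source word to its nearest target word.
    sims = XW @ Z.T
    return np.stack([np.arange(len(XW)), sims.argmax(axis=1)], axis=1)

def self_learning(X, Z, seed_pairs, iters=10):
    pairs = seed_pairs
    for _ in range(iters):
        W = orthogonal_map(X, Z, pairs)      # map source into target space
        pairs = induce_dictionary(X @ W, Z)  # re-induce an improved dictionary
    return W, pairs

# Toy problem: the "target" embeddings are an exact rotation of the source,
# so the structural similarity the method relies on is perfect.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # length-normalize rows
Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))   # hidden rotation
Z = X @ Q
seed = np.stack([np.arange(10)] * 2, axis=1)   # 10 correct seed pairs
W, pairs = self_learning(X, Z, seed)
print((pairs[:, 0] == pairs[:, 1]).mean())     # fraction correctly matched
```

Restricting the mapping to be orthogonal is what makes the loop robust: it preserves distances within each embedding space, so a partially wrong dictionary cannot distort the geometry, and the re-induced dictionary tends to improve from one iteration to the next.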

Mikel Artetxe, Gorka Labaka, Eneko Agirre • 2018

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Bilingual Lexicon Induction | XLING (test) | Average Score | 46.55 | 15 |
| Unsupervised Bilingual Lexicon Induction | PanLex-BLI 6 directions | BG Score | 37.22 | 11 |
| Bilingual Lexicon Induction | BUCC 2020 (test) | DE-EN Score | 37.1 | 10 |
| Multilingual Alignment | XLING standard (test) | Alignment Score (en-de) | 52.1 | 8 |
| Bilingual Lexicon Induction | PanLex-BLI HU-EU (test) | P@1 | 20.03 | 7 |
| Bilingual Lexicon Induction | PanLex-BLI EU-ET (test) | P@1 | 9.83 | 7 |
| Bilingual Lexicon Induction | PanLex-BLI BG-CA (test) | P@1 | 39.43 | 6 |
| Bilingual Lexicon Induction | PanLex-BLI CA-HE (test) | P@1 | 24.64 | 6 |
| Bilingual Lexicon Induction | PanLex-BLI HE-BG (test) | P@1 | 31.55 | 6 |
| Bilingual Lexicon Induction | PanLex-BLI ET-HU (test) | P@1 | 35.55 | 6 |

Showing 10 of 12 rows.
