BilBOWA: Fast Bilingual Distributed Representations without Word Alignments

About

We introduce BilBOWA (Bilingual Bag-of-Words without Alignments), a simple and computationally efficient model for learning bilingual distributed representations of words which can scale to large monolingual datasets and does not require word-aligned parallel training data. Instead, it trains directly on monolingual data and extracts a bilingual signal from a smaller set of raw-text sentence-aligned data. This is achieved using a novel sampled bag-of-words cross-lingual objective, which is used to regularize two noise-contrastive language models for efficient cross-lingual feature learning. We show that bilingual embeddings learned using the proposed model outperform state-of-the-art methods on a cross-lingual document classification task as well as a lexical translation task on WMT11 data.
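The core idea can be sketched in a few lines: alongside the two monolingual objectives, each sentence-aligned pair contributes a cross-lingual term that pulls the bag-of-words (mean) embeddings of the two sentences together. The sketch below, assuming toy vocabularies, random initial embeddings, and a squared-L2 form of the cross-lingual term, optimizes only that term; the monolingual noise-contrastive skip-gram losses are omitted for brevity, so this is an illustration of the regularizer, not the full model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabularies and randomly initialized embeddings (hypothetical sizes).
dim = 8
vocab_en = {"the": 0, "cat": 1, "sits": 2}
vocab_de = {"die": 0, "katze": 1, "sitzt": 2}
E_en = rng.normal(scale=0.1, size=(len(vocab_en), dim))
E_de = rng.normal(scale=0.1, size=(len(vocab_de), dim))

def crosslingual_loss(sent_en, sent_de):
    """Bag-of-words cross-lingual term for one aligned sentence pair:
    half the squared L2 distance between the two sentence means."""
    mean_en = E_en[[vocab_en[w] for w in sent_en]].mean(axis=0)
    mean_de = E_de[[vocab_de[w] for w in sent_de]].mean(axis=0)
    diff = mean_en - mean_de
    return 0.5 * float(diff @ diff), diff

def sgd_step(sent_en, sent_de, lr=0.5):
    """One SGD step on the cross-lingual term only (the monolingual
    skip-gram terms of the full model are not included here)."""
    loss, diff = crosslingual_loss(sent_en, sent_de)
    # d(loss)/d(mean_en) = diff, so each English word vector moves by
    # -lr * diff / len(sentence); the German side moves the opposite way.
    for w in sent_en:
        E_en[vocab_en[w]] -= lr * diff / len(sent_en)
    for w in sent_de:
        E_de[vocab_de[w]] += lr * diff / len(sent_de)
    return loss

pair = (["the", "cat", "sits"], ["die", "katze", "sitzt"])
losses = [sgd_step(*pair) for _ in range(50)]
print(losses[0] > losses[-1])  # prints True: the embeddings align
```

Because the update shrinks the gap between the two sentence means by a constant factor each step, the cross-lingual loss decreases monotonically on this toy pair; in the full model this pull is balanced against the monolingual objectives, which keep the embeddings useful within each language.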

Stephan Gouws, Yoshua Bengio, Greg Corrado • 2014

Related benchmarks

Task                                  | Dataset                                                | Metric   | Result | Rank
Cross-lingual Document Classification | RCV1/RCV2, EN -> DE, 1,000 documents per language (test) | Accuracy | 86.5   | 27
Cross-lingual Document Classification | RCV1/RCV2, DE -> EN, 1,000 documents per language (test) | Accuracy | 75     | 27
Cross-lingual Document Classification | Reuters, DE -> EN                                        | Accuracy | 75     | 13
Cross-lingual Document Classification | Reuters, EN -> DE (test)                                 | Accuracy | 86.5   | 7
Word Translation                      | WMT11 English-Spanish, En -> Sp (test)                   | P@1      | 39     | 4
Word Translation                      | WMT11 English-Spanish, Sp -> En (test)                   | P@1      | 44     | 4
