
Zero-shot Dependency Parsing with Pre-trained Multilingual Sentence Representations

About

We investigate whether off-the-shelf deep bidirectional sentence representations trained on a massively multilingual corpus (multilingual BERT) enable the development of an unsupervised universal dependency parser. This approach leverages only a mix of monolingual corpora in many languages and does not require any translation data, making it applicable to low-resource languages. In our experiments we outperform the best CoNLL 2018 language-specific systems in all of the shared task's six truly low-resource languages while using a single system. However, we also find that (i) parsing accuracy still varies dramatically when changing the training languages and (ii) in some target languages zero-shot transfer fails under all tested conditions, raising concerns about the 'universality' of the whole approach.
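The transfer recipe the abstract describes can be illustrated in miniature: a shared multilingual encoder places source- and target-language tokens in one vector space, a parsing head is trained on source-language trees only, and the trained head is then applied unchanged to the target language. The sketch below is a toy stand-in (a fixed random embedding table instead of multilingual BERT, a perceptron-trained head scorer instead of the paper's parser; all names and data are synthetic), not the paper's actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for multilingual BERT: one shared embedding table over a joint
# vocabulary, so both toy "languages" live in the same vector space.
DIM = 16
VOCAB = ["<root>", "the", "cat", "sleeps", "le", "chat", "dort"]
EMB = {w: rng.normal(size=DIM) for w in VOCAB}

def encode(tokens):
    """Embed a sentence, prepending an artificial root token at position 0."""
    return np.stack([EMB[w] for w in ["<root>"] + tokens])

def predict_heads(W, tokens):
    """For each token, pick the highest-scoring head (bilinear score h^T W d)."""
    H = encode(tokens)
    scores = H @ W @ H.T              # scores[i, j]: token j attaches to head i
    heads = []
    for j in range(1, len(tokens) + 1):
        s = scores[:, j].copy()
        s[j] = -np.inf                # a token cannot be its own head
        heads.append(int(np.argmax(s)))
    return heads                      # head 0 means "attached to root"

# Train the scorer with perceptron updates on source-language trees only.
src_treebank = [(["the", "cat", "sleeps"], [2, 3, 0])]  # gold heads; 0 = root
W = np.zeros((DIM, DIM))
for _ in range(50):
    for tokens, gold in src_treebank:
        H = encode(tokens)
        for j, (p, g) in enumerate(zip(predict_heads(W, tokens), gold), start=1):
            if p != g:
                W += np.outer(H[g], H[j]) - np.outer(H[p], H[j])

# Zero-shot step: apply the same W to an unseen "target language" sentence.
tgt_heads = predict_heads(W, ["le", "chat", "dort"])
print(tgt_heads)
```

In the toy setting the target-language predictions are of course meaningless; the point the paper tests is that with a genuinely multilingual encoder in place of the random table, this exact train-on-source, apply-to-target pattern produces usable trees for many (though, per the abstract, not all) target languages.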

Ke Tran, Arianna Bisazza · 2019

Related benchmarks

Task                 Dataset                                                        Result                Rank
Dependency Parsing   Universal Dependencies Low-Resource Languages (unseen test)    LAS (Breton): 52.62   12
Dependency Parsing   Universal Dependencies High-Resource Languages (unseen test)   LAS (Finnish): 62.29  10
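The LAS figures above are labeled attachment scores: the percentage of tokens that receive both the correct head and the correct dependency relation label. A minimal sketch of the metric (toy data, not from the paper):

```python
def las(gold, pred):
    """Labeled attachment score: fraction of tokens whose predicted head
    AND dependency label both match the gold annotation, as a percentage.
    gold/pred: lists of (head_index, label) pairs, one per token."""
    assert len(gold) == len(pred)
    correct = sum(1 for g, p in zip(gold, pred) if g == p)
    return 100.0 * correct / len(gold)

# Toy example: the second token gets the right head but the wrong label,
# so it counts as an error under LAS (it would still count under UAS).
gold = [(2, "det"), (3, "nsubj"), (0, "root")]
pred = [(2, "det"), (3, "obj"), (0, "root")]
print(round(las(gold, pred), 2))  # → 66.67
```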
