
Multilingual AMR Parsing with Noisy Knowledge Distillation

About

We study multilingual AMR parsing from the perspective of knowledge distillation, where the aim is to learn and improve a multilingual AMR parser by using an existing English parser as its teacher. We constrain our exploration to a strict multilingual setting: there is but one model to parse all different languages, including English. We identify that noisy input and precise output are the key to successful distillation. Together with extensive pre-training, we obtain an AMR parser whose performance surpasses all previously published results on four different foreign languages, namely German, Spanish, Italian, and Chinese, by large margins (up to 18.8 Smatch points on Chinese and 11.3 Smatch points on average). Our parser also achieves performance on English comparable to the latest state-of-the-art English-only parser.
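To make the noisy-input/precise-output idea concrete, here is a minimal sketch of how such distillation data could be assembled: machine-translated sentences serve as the noisy inputs, while the AMR graphs the English teacher produces on the clean source sentences serve as the precise outputs. All names below (translate, english_teacher, build_distillation_data) are hypothetical placeholders based on the abstract, not the authors' code.

    # Hedged sketch of the noisy-input / precise-output distillation recipe.
    # Assumption: `english_teacher.parse` maps an English sentence to an AMR
    # string, and `translate(sent, lang)` returns a machine translation.
    def build_distillation_data(english_sentences, target_langs,
                                translate, english_teacher):
        """Pair noisy translated inputs with precise teacher AMRs."""
        pairs = []
        for sent in english_sentences:
            # Precise output: parse the clean English sentence with the teacher.
            amr = english_teacher.parse(sent)
            # Keep the English pair: one student model covers all languages.
            pairs.append((sent, amr))
            for lang in target_langs:
                # Noisy input: a machine translation of the same sentence,
                # paired with the same precise AMR target.
                pairs.append((translate(sent, lang), amr))
        return pairs

The student is then trained on these (sentence, AMR) pairs as ordinary sequence-to-graph supervision; under this reading, the noise lives entirely on the input side.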

Deng Cai, Xin Li, Jackie Chun-Sing Ho, Lidong Bing, Wai Lam • 2021

Related benchmarks

Task                      | Dataset                                      | Smatch (F1 × 100) | Rank
Cross-lingual AMR Parsing | AMR German (DE) human-translated 2.0 (test)  | 73.1              | 15
Cross-lingual AMR Parsing | AMR Italian (IT) human-translated 2.0 (test) | 75.4              | 15
Cross-lingual AMR Parsing | AMR Spanish (ES) human-translated 2.0 (test) | 75.9              | 15
Cross-lingual AMR Parsing | AMR Chinese (ZH) human-translated 2.0 (test) | 61.9              | 13
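The Smatch scores above are F1 values over aligned graph triples, scaled to 0-100. Below is a small, hedged example of scoring a predicted AMR against a reference with the smatch Python package (pip install smatch); it reflects my reading of that package's API, not tooling from the paper itself.

    # Hedged example: Smatch between a predicted and a gold AMR string,
    # using the `smatch` package's hill-climbing alignment search.
    import smatch

    pred = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"
    gold = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"

    match_num, test_num, gold_num = smatch.get_amr_match(pred, gold)
    precision, recall, f_score = smatch.compute_f(match_num, test_num, gold_num)
    print(f"Smatch F1: {f_score:.3f}")  # 1.000 here, since the graphs are identical

Multiplying f_score by 100 gives numbers on the same scale as the table.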
