
Levenshtein Transformer

About

Modern neural sequence generation models are built to either generate tokens step-by-step from scratch or (iteratively) modify a sequence of tokens bounded by a fixed length. In this work, we develop the Levenshtein Transformer, a new partially autoregressive model devised for more flexible and amenable sequence generation. Unlike previous approaches, the atomic operations of our model are insertion and deletion. Their combination facilitates not only generation but also sequence refinement, allowing dynamic length changes. We also propose a set of new training techniques dedicated to these operations, effectively exploiting one as the other's learning signal thanks to their complementary nature. Experiments applying the proposed model achieve comparable performance but much-improved efficiency on both generation (e.g. machine translation, text summarization) and refinement tasks (e.g. automatic post-editing). We further confirm the flexibility of our model by showing that a Levenshtein Transformer trained on machine translation can straightforwardly be used for automatic post-editing.
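The two atomic operations can be grounded with a small worked example. The sketch below is not the paper's model; it is a toy dynamic program (Levenshtein distance restricted to insertions and deletions, no substitution) that recovers a minimal sequence of insert/delete operations turning a draft sequence into a target. This is the kind of oracle edit sequence that can serve as a learning signal for the two policies; the function name and output format are illustrative assumptions.

```python
def edit_ops(src, tgt):
    """Minimal insertion/deletion operations turning src into tgt.

    Toy illustration of the two atomic operations (no substitution).
    Returns ops as ("del", src_index, token) or ("ins", src_index, token),
    where src_index refers to positions in the ORIGINAL src (an insertion
    goes before that position).
    """
    n, m = len(src), len(tgt)
    # dp[i][j] = minimal ins/del ops to turn src[:i] into tgt[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i          # delete all remaining src tokens
    for j in range(m + 1):
        dp[0][j] = j          # insert all remaining tgt tokens
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if src[i - 1] == tgt[j - 1]:
                dp[i][j] = dp[i - 1][j - 1]          # keep the token
            else:
                dp[i][j] = 1 + min(dp[i - 1][j],     # delete src[i-1]
                                   dp[i][j - 1])     # insert tgt[j-1]
    # Backtrace to recover one optimal operation sequence.
    ops, i, j = [], n, m
    while i > 0 or j > 0:
        if (i > 0 and j > 0 and src[i - 1] == tgt[j - 1]
                and dp[i][j] == dp[i - 1][j - 1]):
            i, j = i - 1, j - 1
        elif i > 0 and dp[i][j] == dp[i - 1][j] + 1:
            ops.append(("del", i - 1, src[i - 1]))
            i -= 1
        else:
            ops.append(("ins", i, tgt[j - 1]))
            j -= 1
    return list(reversed(ops))


# Example: refine the draft ["a", "b", "c"] into ["a", "c", "d"]:
# delete "b" at position 1, insert "d" before position 3.
print(edit_ops(["a", "b", "c"], ["a", "c", "d"]))
# → [('del', 1, 'b'), ('ins', 3, 'd')]
```

Because the two operations are exact inverses, the same alignment also yields the signal in the other direction (what an insertion policy built, a deletion policy can learn to undo), which is the complementarity the abstract refers to.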

Jiatao Gu, Changhan Wang, Jake Zhao · 2019

Related benchmarks

Task                         | Dataset                                 | Metric     | Result | Rank
-----------------------------|-----------------------------------------|------------|--------|-----
Machine Translation          | WMT En-De 2014 (test)                   | BLEU       | 27.4   | 379
Grammatical Error Correction | CoNLL 2014 (test)                       | F0.5 Score | 42.5   | 207
Machine Translation          | WMT Ro-En 2016 (test)                   | BLEU       | 33.26  | 82
Machine Translation          | WMT14 En-De newstest2014 (test)         | BLEU       | 27.27  | 65
Grammatical Error Correction | NLPCC-18 Chinese GEC (test)             | Precision  | 24.9   | 49
Machine Translation          | WMT'16 Romanian-English (Ro-En) (test)  | BLEU       | 33.26  | 21
Text Summarization           | Annotated English Gigaword standard (test) | ROUGE-1 | 37.4   | 15
Machine Translation          | IT En-De out-of-domain WMT14 (test)     | BLEU       | 24.7   | 10
Machine Translation          | Law De-En (test)                        | BLEU       | 70.64  | 8
Machine Translation          | Law (Ko-En) (test)                      | BLEU       | 51.31  | 8
Showing 10 of 26 rows
