
Latent-Variable Non-Autoregressive Neural Machine Translation with Deterministic Inference Using a Delta Posterior

About

Although neural machine translation models have reached high translation quality, their autoregressive nature makes inference difficult to parallelize and leads to high translation latency. Inspired by recent refinement-based approaches, we propose LaNMT, a latent-variable non-autoregressive model with continuous latent variables and a deterministic inference procedure. In contrast to existing approaches, we use a deterministic inference algorithm to find the target sequence that maximizes the lower bound to the log-probability. During inference, the length of the translation adapts automatically. Our experiments show that the lower bound can be greatly increased by running the inference algorithm, resulting in significantly improved translation quality. The proposed model closes the performance gap between non-autoregressive and autoregressive approaches on the ASPEC Ja-En dataset with 8.6x faster decoding. On the WMT'14 En-De dataset, it narrows the gap with the autoregressive baseline to 2.0 BLEU points with a 12.5x speedup. By decoding multiple initial latent variables in parallel and rescoring with a teacher model, the proposed model further reduces the gap to 1.0 BLEU point on the WMT'14 En-De task with a 6.8x speedup.
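The deterministic inference described above alternates between two updates until a fixed point: re-estimate the latent variable from the current translation (the delta posterior), then re-decode the translation from the new latent. The following is a toy numerical sketch of that coordinate-ascent pattern on a made-up surrogate objective; the functions and constants are purely illustrative and are not the paper's actual model.

```python
# Toy coordinate ascent mirroring LaNMT-style deterministic inference:
# alternate "decode y given z" and "update z given y", and observe that
# the surrogate lower bound never decreases. All quantities are invented
# for illustration; the real model works on sequences, not scalars.

def lower_bound(y, z, c=4.0):
    # Surrogate objective standing in for the variational lower bound.
    return -(y - z) ** 2 - (z - c) ** 2

def update_y(z):
    # "Decoding" step: the y that maximizes the bound for fixed z.
    return z

def update_z(y, c=4.0):
    # "Delta posterior" step: the z that maximizes the bound for fixed y.
    return (y + c) / 2.0

def deterministic_inference(y0, steps=10, c=4.0):
    y = y0
    z = update_z(y, c)
    bounds = [lower_bound(y, z, c)]
    for _ in range(steps):
        y = update_y(z)          # re-decode the target
        z = update_z(y, c)       # re-estimate the latent
        bounds.append(lower_bound(y, z, c))
    return y, z, bounds

y, z, bounds = deterministic_inference(y0=0.0)
# Each alternation maximizes the bound in one variable, so the bound
# is monotonically non-decreasing and converges to a fixed point.
assert all(b2 >= b1 for b1, b2 in zip(bounds, bounds[1:]))
```

Because each step exactly maximizes the bound in one coordinate, the sequence of bound values is monotone, which is the property the abstract reports: running more inference iterations raises the lower bound and, in the real model, improves translation quality.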

Raphael Shu, Jason Lee, Hideki Nakayama, Kyunghyun Cho • 2019

Related benchmarks

Task                  Dataset                            Result     Rank
Machine Translation   WMT En-De 2014 (test)              BLEU 11.8  379
Machine Translation   WMT 2014 (test)                    BLEU 24.2  100
Machine Translation   WMT En-De '14                      BLEU 11.8  89
Machine Translation   WMT Ro-En 2016 (test)              BLEU 29.1  82
Machine Translation   WMT14 En-De newstest2014 (test)    BLEU 26.3  65
