
FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow

About

Most sequence-to-sequence (seq2seq) models are autoregressive: they generate each token by conditioning on previously generated tokens. In contrast, non-autoregressive seq2seq models generate all tokens in one pass, which improves efficiency through parallel processing on hardware such as GPUs. However, directly modeling the joint distribution of all tokens simultaneously is challenging, and even with increasingly complex model structures their accuracy lags significantly behind that of autoregressive models. In this paper, we propose a simple, efficient, and effective model for non-autoregressive sequence generation using latent variable models. Specifically, we turn to generative flow, an elegant technique for modeling complex distributions with neural networks, and design several layers of flow tailored to modeling the conditional density of sequential latent variables. We evaluate this model on three neural machine translation (NMT) benchmark datasets, achieving performance comparable to state-of-the-art non-autoregressive NMT models with almost constant decoding time w.r.t. the sequence length.
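The key ingredient the abstract mentions, generative flow, builds a complex density from a stack of invertible transformations with tractable Jacobians. A minimal sketch of one affine coupling layer, the standard building block of such flows, is below; the helper names (`scale_net`, `shift_net`) are illustrative placeholders, not the paper's actual architecture, which conditions the flow on the source sentence.

```python
import numpy as np

def coupling_forward(x, scale_net, shift_net):
    """Map x -> z invertibly; return z and the log-determinant of the Jacobian."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    s = scale_net(x1)          # log-scale, conditioned only on the first half
    t = shift_net(x1)          # shift, conditioned only on the first half
    z2 = x2 * np.exp(s) + t    # transform the second half; first half passes through
    log_det = s.sum(axis=-1)   # Jacobian is triangular, so log|det| = sum of log-scales
    return np.concatenate([x1, z2], axis=-1), log_det

def coupling_inverse(z, scale_net, shift_net):
    """Exact inverse: recover x from z using the same conditioning networks."""
    d = z.shape[-1] // 2
    z1, z2 = z[..., :d], z[..., d:]
    s = scale_net(z1)
    t = shift_net(z1)
    x2 = (z2 - t) * np.exp(-s)
    return np.concatenate([z1, x2], axis=-1)

# Toy "networks": invertibility holds for any functions of the first half,
# so arbitrarily expressive neural networks can be used here.
scale_net = lambda h: np.tanh(h)
shift_net = lambda h: 0.5 * h

x = np.random.randn(3, 4)                      # batch of 3 latent vectors
z, log_det = coupling_forward(x, scale_net, shift_net)
x_rec = coupling_inverse(z, scale_net, shift_net)
print(np.allclose(x, x_rec))                   # the flow is exactly invertible
```

Because each layer is invertible with a cheap log-determinant, stacking such layers yields an exact likelihood for the latent variables, and all positions can be transformed in parallel, which is what enables the near-constant decoding time reported above.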

Xuezhe Ma, Chunting Zhou, Xian Li, Graham Neubig, Eduard Hovy · 2019

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Machine Translation | WMT En-De 2014 (test) | BLEU | 28.39 | 379 |
| Machine Translation | IWSLT De-En 2014 (test) | BLEU | 27.55 | 146 |
| Machine Translation | WMT 2014 (test) | BLEU | 30.68 | 100 |
| Machine Translation | WMT En-De '14 | BLEU | 23.72 | 89 |
| Machine Translation | WMT Ro-En 2016 (test) | BLEU | 32.84 | 82 |
| Machine Translation | WMT14 En-De newstest2014 (test) | BLEU | 25.31 | 65 |
| Machine Translation | WMT De-En 14 (test) | BLEU | 30.68 | 59 |
| Machine Translation | WMT 2016 (test) | BLEU | 32.91 | 58 |
| Machine Translation | WMT16 EN-RO (test) | BLEU | 32.2 | 56 |
| Machine Translation | WMT De-En 14 | BLEU | 28.39 | 33 |

Showing 10 of 13 rows.

Other info

Code
