
KERMIT: Generative Insertion-Based Modeling for Sequences

About

We present KERMIT, a simple insertion-based approach to generative modeling for sequences and sequence pairs. KERMIT models the joint distribution and its decompositions (i.e., marginals and conditionals) using a single neural network and, unlike much prior work, does not rely on a prespecified factorization of the data distribution. During training, one can feed KERMIT paired data $(x, y)$ to learn the joint distribution $p(x, y)$, and optionally mix in unpaired data $x$ or $y$ to refine the marginals $p(x)$ or $p(y)$. During inference, we have access to the conditionals $p(x \mid y)$ and $p(y \mid x)$ in both directions. We can also sample from the joint distribution or the marginals. The model supports both serial fully autoregressive decoding and parallel partially autoregressive decoding, with the latter exhibiting an empirically logarithmic runtime. We demonstrate through experiments in machine translation, representation learning, and zero-shot cloze question answering that our unified approach is capable of matching or exceeding the performance of dedicated state-of-the-art systems across a wide range of tasks without the need for problem-specific architectural adaptation.
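To see why parallel partially autoregressive decoding can run in logarithmic time, consider a balanced-tree insertion order: each round, one token is inserted into the middle of every remaining gap, so the number of generated tokens roughly doubles per round. The sketch below is a toy simulation of this schedule under assumed oracle insertions (a real KERMIT model predicts (token, slot) pairs with a Transformer; `parallel_insertion_decode` and the target string are hypothetical illustrations, not the authors' code).

```python
import math

def parallel_insertion_decode(target):
    """Rebuild `target` with a balanced-tree insertion order:
    each round, insert the middle token of every remaining gap in parallel."""
    canvas = []                      # (target position, token) pairs generated so far
    gaps = [(0, len(target))]        # half-open spans of target indices not yet filled
    rounds = 0
    while gaps:
        new_gaps = []
        for lo, hi in gaps:
            mid = (lo + hi) // 2     # oracle insertion: the true middle token
            canvas.append((mid, target[mid]))
            if lo < mid:
                new_gaps.append((lo, mid))
            if mid + 1 < hi:
                new_gaps.append((mid + 1, hi))
        gaps = new_gaps
        rounds += 1
    canvas.sort()                    # order tokens by target position
    return [tok for _, tok in canvas], rounds

tokens = list("insertion-based decoding")
out, n_rounds = parallel_insertion_decode(tokens)
assert out == tokens
# A length-n sequence needs only ceil(log2(n+1)) rounds instead of n steps:
print(n_rounds, math.ceil(math.log2(len(tokens) + 1)))  # 5 5
```

With fully autoregressive (left-to-right or arbitrary serial) decoding the same 24-token sequence would take 24 steps; the balanced insertion schedule finishes in 5 rounds, matching the empirically logarithmic runtime reported in the abstract.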

William Chan, Nikita Kitaev, Kelvin Guu, Mitchell Stern, Jakob Uszkoreit · 2019

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Natural Language Understanding | GLUE | SST-2: 94.2 | 452 |
| Machine Translation | WMT En-De 2014 (test) | BLEU: 28.1 | 379 |
| Machine Translation | WMT English-German 2014 (test) | BLEU: 31.4 | 136 |
| Question Answering | SQuAD (dev) | F1: 30.3 | 74 |
| Machine Translation | WMT14 DE-EN (test) | BLEU: 28.6 | 28 |
| Machine Translation | En → De (test) | BLEU: 27.8 | 23 |
| Cloze Question Answering | SQuAD | Exact Match: 20.9 | 5 |
| Representation Learning | GLUE (test) | GLUE Score: 79.8 | 3 |
