
Learning Neural Templates for Text Generation

About

While neural encoder-decoder models have had significant empirical success in text generation, several problems with this style of generation remain unaddressed. Encoder-decoder models are largely (a) uninterpretable, and (b) difficult to control in terms of their phrasing or content. This work proposes a neural generation system with a hidden semi-Markov model (HSMM) decoder, which learns latent, discrete templates jointly with learning to generate. We show that this model learns useful templates, and that these templates make generation both more interpretable and more controllable. Furthermore, we show that this approach scales to real datasets and achieves strong performance, nearing that of encoder-decoder text generation models.
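The core computational idea behind an HSMM decoder is semi-Markov dynamic programming: instead of emitting one token per latent state, each discrete state emits a variable-length segment, so decoding (or the Viterbi approximation of inference) searches over segmentations and state sequences jointly. The sketch below is a minimal, hypothetical illustration of that search, not the paper's implementation; the scoring functions, toy scores, and state names are invented for the example.

```python
import math

def hsmm_viterbi(T, K, max_len, seg_score, trans_score):
    """Semi-Markov Viterbi: best segmentation of positions 0..T-1 into
    labeled segments (z, i, j), where latent state z emits span [i, j).
    seg_score(z, i, j) scores a segment; trans_score(zp, z) scores a
    transition between consecutive segment states."""
    # best[t][z] = best score of a segmentation of [0, t) ending in state z
    best = [[-math.inf] * K for _ in range(T + 1)]
    back = [[None] * K for _ in range(T + 1)]
    for z in range(K):
        best[0][z] = 0.0
    for t in range(1, T + 1):
        for z in range(K):
            for length in range(1, min(max_len, t) + 1):
                i = t - length
                base = seg_score(z, i, t)
                if i == 0:
                    cand, prev = base, None  # first segment: no transition
                else:
                    prev = max(range(K),
                               key=lambda zp: best[i][zp] + trans_score(zp, z))
                    cand = best[i][prev] + trans_score(prev, z) + base
                if cand > best[t][z]:
                    best[t][z] = cand
                    back[t][z] = (i, prev)
    # backtrace the highest-scoring segmentation
    z = max(range(K), key=lambda s: best[T][s])
    t, segments = T, []
    while t > 0:
        i, prev = back[t][z]
        segments.append((z, i, t))
        t, z = i, (prev if prev is not None else z)
    return list(reversed(segments))

# Toy example: 4 tokens, 3 hypothetical states; hand-set scores reward
# the "template" name / verb-phrase / profession (illustrative only).
good = {(0, 0, 1): 2.0, (1, 1, 3): 2.0, (2, 3, 4): 2.0}
segments = hsmm_viterbi(
    T=4, K=3, max_len=4,
    seg_score=lambda z, i, j: good.get((z, i, j), -1.0),
    trans_score=lambda zp, z: 0.0,
)
# segments is a list of (state, start, end) spans covering the sequence
```

A sequence of recovered segment states (here 0 → 1 → 2) is exactly the kind of discrete template the paper extracts: the state sequence can be reused at generation time to control the phrasing of new outputs.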

Sam Wiseman, Stuart M. Shieber, Alexander M. Rush • 2018

Related benchmarks

Task                     Dataset          Metric  Result  Rank
Data-to-text generation  E2E (test)       BLEU    59.8    33
Data-to-text generation  WIKIBIO (test)   BLEU    35.17   17
