
NeuroLogic Decoding: (Un)supervised Neural Text Generation with Predicate Logic Constraints

About

Conditional text generation often requires lexical constraints, i.e., which words should or shouldn't be included in the output text. While the dominant recipe for conditional text generation has been large-scale pretrained language models finetuned on task-specific training data, such models do not learn to follow the underlying constraints reliably, even when supervised with large amounts of task-specific examples. We propose NeuroLogic Decoding, a simple yet effective algorithm that enables neural language models, supervised or not, to generate fluent text while satisfying complex lexical constraints. Our approach is powerful yet efficient: it handles any set of lexical constraints expressible under predicate logic, while its asymptotic runtime is equivalent to that of conventional beam search. Empirical results on four benchmarks show that NeuroLogic Decoding outperforms previous approaches, including algorithms that handle only a subset of our constraints. Moreover, we find that unsupervised models with NeuroLogic Decoding often outperform supervised models with conventional decoding, even when the latter are based on considerably larger networks. Our results suggest the limits of large-scale neural networks for fine-grained controllable generation and the promise of inference-time algorithms.
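To make the decoding idea concrete, the sketch below shows one way beam search can be extended with predicate-logic lexical constraints in CNF form. It is a simplified illustration, not the paper's algorithm: the toy vocabulary VOCAB, the uniform lm_logprob stand-in for a neural language model, and the per-state beam allocation heuristic are all assumptions made for this example.

```python
# Minimal sketch of lexically constrained beam search in the spirit of
# NeuroLogic Decoding. This is NOT the authors' implementation: the toy
# vocabulary, the uniform lm_logprob stand-in, the CNF encoding, and the
# per-state beam allocation are simplifying assumptions for illustration.

import math

VOCAB = ["the", "dog", "chased", "a", "cat", "quickly", "<eos>"]

def lm_logprob(prefix, token):
    # Placeholder: a real system would query a neural language model here.
    return -math.log(len(VOCAB))

# Constraints in conjunctive normal form (CNF) over word inclusion/exclusion.
# Each clause is a list of (word, should_appear) literals; every clause must
# hold in the final output.
CNF = [
    [("dog", True), ("cat", True)],   # "dog" OR "cat" must appear
    [("chased", True)],               # "chased" must appear
    [("quickly", False)],             # "quickly" must NOT appear
]

def num_satisfied(tokens):
    """Clauses already satisfied by a positive literal present in `tokens`."""
    words = set(tokens)
    return sum(any(pos and w in words for w, pos in clause) for clause in CNF)

def irreversibly_violated(tokens):
    """A purely negative clause is dead once all of its banned words appear."""
    words = set(tokens)
    return any(
        all(not pos for _, pos in clause)
        and all(w in words for w, _ in clause)
        for clause in CNF
    )

def constrained_beam_search(beam_size=4, max_len=6):
    beams = [([], 0.0)]  # (tokens, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for tokens, score in beams:
            if tokens and tokens[-1] == "<eos>":
                candidates.append((tokens, score))  # carry finished hypotheses
                continue
            for tok in VOCAB:
                new = tokens + [tok]
                if irreversibly_violated(new):
                    continue  # prune: can never satisfy the CNF anymore
                candidates.append((new, score + lm_logprob(tokens, tok)))
        # Group candidates by constraint-satisfaction state and keep the best
        # of each group, so partially satisfying hypotheses are not crowded
        # out by high-probability but constraint-ignoring ones.
        by_state = {}
        for cand in candidates:
            by_state.setdefault(num_satisfied(cand[0]), []).append(cand)
        beams = []
        for state in sorted(by_state, reverse=True):
            group = sorted(by_state[state], key=lambda c: c[1], reverse=True)
            beams.extend(group[: max(1, beam_size // len(by_state))])
        beams = beams[:beam_size]
    # Prefer hypotheses satisfying the most clauses, breaking ties by score.
    best_tokens, _ = max(beams, key=lambda c: (num_satisfied(c[0]), c[1]))
    return best_tokens

if __name__ == "__main__":
    print(constrained_beam_search())  # e.g. a sequence with "dog"/"cat" and "chased"
```

Grouping hypotheses by how many clauses they already satisfy, and pruning those that can no longer satisfy a clause, keeps the search constraint-aware at roughly the per-step cost of ordinary beam search; the published algorithm refines this with more careful constraint-state tracking and candidate selection.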

Ximing Lu, Peter West, Rowan Zellers, Ronan Le Bras, Chandra Bhagavatula, Yejin Choi • 2020

Related benchmarks

Task                             Dataset             Result                           Rank
Table-to-text generation         E2E NLG (test)      --                               37
Constrained Decoding             LogicBench          Constraint Satisfaction: 96.2    7
Constrained Decoding             GSM8K Constrained   Constraint Satisfaction: 94.1    3
Lexically constrained decoding   CommonGen           Count: 3.3                       3
