
Controllable Natural Language Generation with Contrastive Prefixes

About

To guide the generation of large pretrained language models (LMs), previous work has focused on directly fine-tuning the language model or utilizing an attribute discriminator. In this work, we propose a novel lightweight framework for controllable GPT-2 generation, which utilizes a set of small attribute-specific vectors, called prefixes, to steer natural language generation. Different from prefix-tuning, where each prefix is trained independently, we take the relationship among prefixes into consideration and train multiple prefixes simultaneously. We propose a novel supervised method and an unsupervised method to train the prefixes for single-aspect control, and the combination of these two methods can achieve multi-aspect control. Experimental results on both single-aspect and multi-aspect control show that our methods can guide generation towards the desired attributes while keeping high linguistic quality.
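The core mechanism described above, attribute-specific prefix vectors prepended to the input of a frozen LM, with prefixes combined for multi-aspect control, can be illustrated with a minimal, hypothetical sketch. All names, sizes, and attributes below are illustrative assumptions, not the paper's implementation; the real method trains the prefixes jointly (with a contrastive objective across opposing attributes) as key-value activations at every GPT-2 layer, which is omitted here for brevity.

```python
# Hypothetical sketch: attribute-specific "prefixes" steering a frozen LM.
# A prefix here is a short sequence of vectors prepended to the token
# embeddings; a single layer is modeled for simplicity.
import random

PREFIX_LEN = 5   # toy number of prefix positions per attribute
HIDDEN = 8       # toy hidden size

def make_prefix(seed):
    """Initialize one attribute-specific prefix (PREFIX_LEN x HIDDEN)."""
    rng = random.Random(seed)
    return [[rng.uniform(-0.1, 0.1) for _ in range(HIDDEN)]
            for _ in range(PREFIX_LEN)]

# One small prefix per attribute; in the paper these would be trained
# simultaneously while the LM itself stays frozen.
prefixes = {
    "positive": make_prefix(0),
    "negative": make_prefix(1),
    "sports":   make_prefix(2),
}

def control_input(token_embeddings, attributes):
    """Prepend the selected attribute prefixes to the input sequence.
    Multi-aspect control concatenates one prefix per desired aspect."""
    steered = []
    for attr in attributes:
        steered.extend(prefixes[attr])
    return steered + token_embeddings

tokens = [[0.0] * HIDDEN for _ in range(3)]  # 3 dummy token embeddings
single = control_input(tokens, ["positive"])            # single-aspect
multi = control_input(tokens, ["positive", "sports"])   # multi-aspect
```

The point of the sketch is only the data flow: the LM's weights never change; choosing which prefix (or which concatenation of prefixes) to prepend is what selects the target attribute(s).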

Jing Qian, Li Dong, Yelong Shen, Furu Wei, Weizhu Chen · 2022

Related benchmarks

Task | Dataset | Metric | Result | Rank
Detoxification | Jigsaw (test) | Perplexity (PPL) | 29.28 | 29
Controllable Text Generation | IMDB (test) | O-PPL | 13.01 | 27
Topic Classification and Text Generation | AGNews (test) | PPL (Output) | 20.41 | 16
Multi-Aspect Controllable Text Generation | Fyelp CompMCTG (Hold-Out) | Acomp | 92.79 | 12
Multi-Aspect Controllable Text Generation | Fyelp ACD CompMCTG | Acomp | 88.84 | 12
Controllable Text Generation | AGNews (test) | Output Perplexity (O-PPL) | 20.41 | 12
Sentiment Control | IMDB (test) | Sentiment Accuracy (Avg) | 89.5 | 11
Topic Control | AGNews (test) | Avg Topic Accuracy | 86.7 | 11
Multi-Aspect Controllable Text Generation | CompMCTG Overall Summary Average 1.0 | Aavg Score | 79.82 | 10
Multi-attribute Controlled Text Generation | CompMCTG Original | Dist-3 (i.d.) | 0.701 | 10

(Showing 10 of 27 rows.)
