
CTRL: A Conditional Transformer Language Model for Controllable Generation

About

Large-scale language models show promising text generation capabilities, but users cannot easily control particular aspects of the generated text. We release CTRL, a 1.63 billion-parameter conditional transformer language model, trained to condition on control codes that govern style, content, and task-specific behavior. Control codes were derived from structure that naturally co-occurs with raw text, preserving the advantages of unsupervised learning while providing more explicit control over text generation. These codes also allow CTRL to predict which parts of the training data are most likely given a sequence. This provides a potential method for analyzing large amounts of data via model-based source attribution. We have released multiple full-sized, pretrained versions of CTRL at https://github.com/salesforce/ctrl.
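The abstract mentions that control codes let CTRL estimate which parts of the training data most likely generated a given sequence. The mechanism can be sketched with a toy Bayes computation: score the sequence's log-likelihood under each control code, add a prior over codes, and rank by posterior. The function name, the code labels, and all numbers below are hypothetical illustrations, not actual model outputs.

```python
import math

def attribute(log_likelihoods, prior=None):
    """Rank control codes by posterior p(code | text).

    log_likelihoods: dict mapping control code -> log p(text | code)
    prior: optional dict mapping control code -> p(code); uniform if None.
    """
    codes = list(log_likelihoods)
    if prior is None:
        prior = {c: 1.0 / len(codes) for c in codes}
    # Unnormalized log posterior: log p(text | code) + log p(code)
    log_post = {c: log_likelihoods[c] + math.log(prior[c]) for c in codes}
    # Normalize with log-sum-exp for numerical stability
    m = max(log_post.values())
    z = m + math.log(sum(math.exp(v - m) for v in log_post.values()))
    return sorted(((c, math.exp(v - z)) for c, v in log_post.items()),
                  key=lambda kv: kv[1], reverse=True)

# Hypothetical per-code log-likelihoods for one sequence
ranked = attribute({"Wikipedia": -42.1, "Links": -45.8, "Reviews": -51.3})
print(ranked[0])  # highest-posterior control code and its probability
```

In the paper's setting the log-likelihoods would come from running CTRL over the sequence once per control code; the ranking step itself is just this posterior computation.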

Nitish Shirish Keskar, Bryan McCann, Lav R. Varshney, Caiming Xiong, Richard Socher · 2019

Related benchmarks

Task | Dataset | Metric | Result | Rank
Sentiment Steering | OpenWebText Neutral to Negative (test) | Perplexity (PPL) | 35.94 | 27
Sentiment Steering | OpenWebText Neutral to Positive (test) | Perplexity (PPL) | 43.79 | 27
Class-Conditional Language Generation | AG-News | MAUVE (World) | 0.806 | 16
Attribute-Controlled Dialogue Generation | DailyDialog-CG (test) | Emotion Accuracy (E-ACC) | 67.34 | 12
Multi-Aspect Controllable Text Generation | Fyelp CompMCTG (Hold-Out) | Acomp | 82.02 | 12
Multi-Aspect Controllable Text Generation | Fyelp ACD CompMCTG | Acomp | 74.63 | 12
Multi-attribute Conditional Text Generation | CompMCTG Compositional Few-Shot 1.0 (test) | Accuracy | 65.94 | 10
Multi-Aspect Controllable Text Generation | CompMCTG Overall Summary Average 1.0 | Aavg Score | 76.17 | 10
Multi-Constraint Text Generation | CompMCTG Average 1.0 | Relevance (avg) | 3.77 | 10
Multi-Aspect Controllable Text Generation | CompMCTG 1.0 (Original) | Aid Score | 79.1 | 10
Showing 10 of 23 rows
