
Click: Controllable Text Generation with Sequence Likelihood Contrastive Learning

About

It has always been an important yet challenging problem to control language models to avoid generating texts with undesirable attributes, such as toxic language and unnatural repetition. We introduce Click for controllable text generation, which needs no modification to the model architecture and facilitates out-of-the-box use of trained models. It employs a contrastive loss on sequence likelihood, which fundamentally decreases the generation probability of negative samples (i.e., generations with undesirable attributes). It also adopts a novel likelihood ranking-based strategy to construct contrastive samples from model generations. On the tasks of language detoxification, sentiment steering, and repetition reduction, we show that Click outperforms strong baselines of controllable text generation and demonstrate the superiority of Click's sample construction strategy.
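The central mechanism described above — a contrastive loss on sequence likelihood that pushes the likelihood of negative samples (generations with undesirable attributes) below that of positive ones — can be sketched as a max-margin objective. This is a minimal illustration under assumptions, not the authors' implementation: the margin value, the length normalization, and the toy token-level log-probabilities are all assumed for demonstration.

```python
def sequence_log_likelihood(token_log_probs):
    # Sequence likelihood is the sum of per-token log-probabilities;
    # here we length-normalize so sequences of different lengths are comparable.
    return sum(token_log_probs) / len(token_log_probs)

def contrastive_likelihood_loss(pos_token_log_probs, neg_token_log_probs, margin=1.0):
    # Max-margin contrastive loss on sequence likelihood (hinge form):
    # the loss is zero only when the negative sample's likelihood is at
    # least `margin` below the positive sample's likelihood; otherwise
    # minimizing it decreases the negative's generation probability.
    pos_ll = sequence_log_likelihood(pos_token_log_probs)
    neg_ll = sequence_log_likelihood(neg_token_log_probs)
    return max(0.0, margin - (pos_ll - neg_ll))

# Toy example: the positive generation already has much higher
# per-token log-probabilities, so the margin is satisfied.
pos = [-0.2, -0.5, -0.3]
neg = [-1.5, -2.0, -1.0]
loss = contrastive_likelihood_loss(pos, neg, margin=1.0)  # 0.0: margin satisfied
```

In practice the token log-probabilities would come from the language model itself, and the paper's ranking-based strategy determines which of the model's own generations serve as the positive and negative samples in each pair.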

Chujie Zheng, Pei Ke, Zheng Zhang, Minlie Huang • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Sentiment Steering | OpenWebText Neutral to Negative (test) | Perplexity (PPL) | 51.46 | 27 |
| Sentiment Steering | OpenWebText Neutral to Positive (test) | Perplexity (PPL) | 57.43 | 27 |
| Language Detoxification | BAD (test) | Toxicity Reduction | 37 | 10 |
| Language Detoxification | Bot-Adversarial Dialogue (BAD) 1.0 (test) | Toxicity Probability | 0.084 | 10 |
| Sentiment Steering | OpenWebText Negative prompts (test) | Positivity Score | 0.59 | 8 |
| Sentiment Steering | OpenWebText Positive prompts (test) | Negativity Score | 0.6 | 8 |
| Repetition Reduction | WikiText-103 (test) | PPL | 25.62 | 8 |
| Language Detoxification | BAD (val) | Toxicity Proportion | 11 | 7 |

Other info

Code
