
Polyjuice: Generating Counterfactuals for Explaining, Evaluating, and Improving Models

About

While counterfactual examples are useful for analysis and training of NLP models, current generation methods either rely on manual labor to create very few counterfactuals, or only instantiate limited types of perturbations such as paraphrases or word substitutions. We present Polyjuice, a general-purpose counterfactual generator that allows for control over perturbation types and locations, trained by finetuning GPT-2 on multiple datasets of paired sentences. We show that Polyjuice produces diverse sets of realistic counterfactuals, which in turn are useful in various distinct applications: improving training and evaluation on three different tasks (with around 70% less annotation effort than manual generation), augmenting state-of-the-art explanation techniques, and supporting systematic counterfactual error analysis by revealing behaviors easily missed by human experts.
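The paper describes conditioning the finetuned GPT-2 on the original sentence, a control code naming the perturbation type (e.g. negation, lexical), and an optional blanked target that fixes the perturbation location. As a rough illustration of that conditioning scheme, the sketch below builds such a prompt string; the exact special tokens (`<|perturb|>`, `[BLANK]`, `[SEP]`) and formatting here are assumptions based on the paper's description, not a verified reproduction of the released model's input format.

```python
def build_prompt(sentence, ctrl_code, blanked=None):
    """Assemble a conditioning prompt in the style Polyjuice describes:
    original sentence, a perturbation control code, and optionally a
    blanked-out target sentence marking where the change should occur.
    Token names here are illustrative assumptions."""
    prompt = f"{sentence} <|perturb|> [{ctrl_code}]"
    if blanked:
        # The generator would fill the [BLANK] span after the separator.
        prompt += f" {blanked} [SEP]"
    return prompt

# Example: ask for a negation perturbation at a specific location.
print(build_prompt("It is great for kids.", "negation",
                   "It is [BLANK] great for kids."))
```

In this scheme, varying the control code changes the perturbation type, while the placement of `[BLANK]` controls where the edit lands.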

Tongshuang Wu, Marco Tulio Ribeiro, Jeffrey Heer, Daniel S. Weld • 2021

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Counterfactual Generation | SNLI Hypothesis | LFR: 35 | 37 |
| Counterfactual Generation | SNLI Premise | LFR: 0.281 | 37 |
| Counterfactual Generation | AG-News | LFR: 0.165 | 37 |
| Counterfactual Generation | IMDB | LFR: 27.5 | 37 |
| Counterfactual Generation | SST2 (test) | SLFR: 29 | 29 |
| Counterfactual Generation | AG News (test) | SLFR: 18.6 | 29 |
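Assuming LFR in the table above denotes label flip rate — the fraction of generated counterfactuals that change the model's predicted label relative to the original input (with SLFR a soft/self-reported variant) — the metric can be computed as below. This is a minimal sketch of the standard flip-rate calculation, not code from the Polyjuice release.

```python
def label_flip_rate(orig_labels, cf_labels):
    """Fraction of counterfactuals whose predicted label differs
    from the prediction on the corresponding original example."""
    assert len(orig_labels) == len(cf_labels), "need paired predictions"
    flips = sum(o != c for o, c in zip(orig_labels, cf_labels))
    return flips / len(orig_labels)

# Toy example: 2 of 4 counterfactuals flip the predicted label.
rate = label_flip_rate(["pos", "pos", "neg", "neg"],
                       ["neg", "pos", "neg", "pos"])
print(rate)  # 0.5
```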

Other info

Code
