
A BERT-based Distractor Generation Scheme with Multi-tasking and Negative Answer Training Strategies

About

In this paper, we investigate two limitations of existing distractor generation (DG) methods. First, the quality of existing DG methods is still far from practical use; there remains substantial room for improvement. Second, existing DG designs mainly target single-distractor generation, whereas practical multiple-choice question (MCQ) preparation requires multiple distractors. To address these issues, we present a new distractor generation scheme with multi-tasking and negative answer training strategies for effectively generating multiple distractors. The experimental results show that (1) our model advances the state-of-the-art BLEU-1 score from 28.65 to 39.81 and (2) the generated distractors are diverse and show strong distracting power for multiple-choice questions.
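The abstract mentions a "negative answer training" strategy, i.e. discouraging the model from simply reproducing the correct answer as a distractor. As a rough illustration of that idea (not the paper's actual formulation; the function names, toy probability tables, and the `neg_weight` penalty are all assumptions for this sketch), one can combine a standard generation loss on the distractor tokens with a penalty term that grows when the model assigns high likelihood to the answer tokens:

```python
import math

def cross_entropy(probs, target_ids):
    """Mean negative log-likelihood of target token ids.

    probs: list of per-position probability distributions (list of floats,
           indexed by token id); target_ids: list of token ids, same length.
    """
    return -sum(math.log(p[t]) for p, t in zip(probs, target_ids)) / len(target_ids)

def combined_loss(probs, distractor_ids, answer_ids, neg_weight=0.5):
    """Hypothetical multi-objective loss for negative answer training.

    gen_loss rewards generating the distractor; the negative term penalizes
    probability mass placed on the correct answer tokens.
    """
    gen_loss = cross_entropy(probs, distractor_ids)
    # Negating the answer NLL means: low NLL on the answer => large penalty.
    neg_loss = -cross_entropy(probs, answer_ids)
    return gen_loss + neg_weight * neg_loss

# Toy vocabulary of 2 tokens: id 0 = a distractor token, id 1 = the answer token.
favors_distractor = [[0.9, 0.1], [0.8, 0.2]]
favors_answer     = [[0.1, 0.9], [0.2, 0.8]]
```

Under this sketch, a model whose distributions favor the distractor tokens incurs a lower combined loss than one that favors the answer tokens, which is the behavior the strategy is meant to induce.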

Ho-Lam Chung, Ying-Hong Chan, Yao-Chung Fan • 2020

Related benchmarks

Task                    Dataset       Result            Rank
Distractor Generation   RACE          Accuracy 74.34    7
Distractor Generation   RACE (test)   BLEU-1 39.81      6

Other info

Code
