
Neural Question Generation from Text: A Preliminary Study

About

Automatic question generation aims to generate questions from a text passage such that the generated questions can be answered by certain sub-spans of the given passage. Traditional methods mainly use rigid heuristic rules to transform a sentence into related questions. In this work, we propose to apply a neural encoder-decoder model to generate meaningful and diverse questions from natural language sentences. The encoder reads the input text and the answer position to produce an answer-aware input representation, which is fed to the decoder to generate an answer-focused question. We conduct a preliminary study on neural question generation from text with the SQuAD dataset, and the experiment results show that our method can produce fluent and diverse questions.
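The "answer position" input the abstract mentions is commonly encoded as a per-token answer indicator feature that the encoder consumes alongside word embeddings. A minimal sketch of such BIO-style answer tagging, assuming a tokenized passage and an inclusive answer span (the function name is illustrative, not taken from the paper's released code):

```python
def answer_position_features(tokens, answer_start, answer_end):
    """Return one BIO tag per token marking the answer sub-span.

    B = first answer token, I = inside the answer, O = outside.
    `answer_start` and `answer_end` are inclusive token indices.
    These tags would be embedded and concatenated with word
    embeddings to form the answer-aware input representation.
    """
    tags = []
    for i in range(len(tokens)):
        if i == answer_start:
            tags.append("B")
        elif answer_start < i <= answer_end:
            tags.append("I")
        else:
            tags.append("O")
    return tags

# Example: the answer span is the single token "Paris" (index 6).
tokens = "the Eiffel Tower is located in Paris".split()
tags = answer_position_features(tokens, answer_start=6, answer_end=6)
print(list(zip(tokens, tags)))
```

From this tagging, a decoder conditioned on the encoder states can focus its generated question on the marked span rather than on the passage as a whole.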

Qingyu Zhou, Nan Yang, Furu Wei, Chuanqi Tan, Hangbo Bao, Ming Zhou • 2017

Related benchmarks

Task                | Dataset                              | Result          | Rank
Question Generation | SQuAD 1.1 (test)                     | BLEU-4: 13.29   | 29
Question Generation | SQuAD (test)                         | BLEU-1: 13.27   | 22
Question Generation | SQuAD 1.1 (dev)                      | BLEU-4: 13.27   | 16
Question Generation | FairytaleQA (test)                   | Q-B4: 0.503     | 6
Question Generation | SQuAD Human Evaluation Subset (test) | Avg Score: 2.18 | 2
