
Giving BERT a Calculator: Finding Operations and Arguments with Reading Comprehension

About

Reading comprehension models have been successfully applied to extractive text answers, but it is unclear how best to generalize these models to abstractive numerical answers. We enable a BERT-based reading comprehension model to perform lightweight numerical reasoning. We augment the model with a predefined set of executable 'programs' which encompass simple arithmetic as well as extraction. Rather than having to learn to manipulate numbers directly, the model can pick a program and execute it. On the recent Discrete Reasoning Over Passages (DROP) dataset, designed to challenge reading comprehension models, we show a 33% absolute improvement by adding shallow programs. The model can learn to predict new operations when appropriate in a math word problem setting (Roy and Roth, 2015) with very few training examples.
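The abstract's core idea — letting the model pick and execute a predefined program rather than manipulate numbers directly — can be sketched as follows. This is an illustrative toy, not the paper's implementation: the program inventory, function names, and the hard-coded "prediction" are all assumptions for demonstration.

```python
# Hypothetical sketch: instead of generating a numeric answer token-by-token,
# the model selects one of a small set of executable operations plus its
# arguments, and the answer is produced by executing that program.
from typing import Callable, Dict, List

# A predefined inventory of lightweight programs: simple arithmetic over
# numbers found in the passage, plus plain span extraction.
PROGRAMS: Dict[str, Callable] = {
    "span": lambda args: args[0],           # extractive answer (copy a span)
    "add": lambda args: args[0] + args[1],  # a + b
    "diff": lambda args: args[0] - args[1], # a - b
    "count": lambda args: len(args),        # how many selected items
}

def execute(op: str, args: List) -> object:
    """Execute the chosen program on the predicted arguments."""
    return PROGRAMS[op](args)

# In the full model, `op` would come from a classification head and `args`
# from pointing at numbers or spans in the passage. Here the "prediction"
# is hard-coded to show only the execution step.
passage_numbers = [35, 28]  # e.g. two yardage values extracted from a passage
answer = execute("diff", [passage_numbers[0], passage_numbers[1]])
print(answer)  # 7
```

Because each program is a fixed, executable function, the learning problem reduces to classification (which operation) plus pointing (which arguments), avoiding direct numeric generation.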

Daniel Andor, Luheng He, Kenton Lee, Emily Pitler • 2019

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Reading Comprehension | DROP (dev) | F1 Score | 83.98 | 63 |
| Reading Comprehension | DROP (test) | F1 Score | 83.56 | 61 |
| Reading Comprehension | DROP 1.0 (test) | EM | 78.14 | 11 |
| Math Word Problem Solving | Illinois (IL) 562 single-step word problems (5-fold cross-validation) | Accuracy | 83.2 | 10 |
| Reading Comprehension | DROP Single-Span questions (dev) | EM | 79.8 | 10 |
| Reading Comprehension | DROP Multi-Span questions (dev) | EM | 6.2 | 10 |
| Reading Comprehension | DROP v1.0 (dev) | EM | 78.97 | 8 |
