ReasTAP: Injecting Table Reasoning Skills During Pre-training via Synthetic Reasoning Examples
About
Reasoning over tabular data requires both an understanding of table structure and a broad set of table reasoning skills. Current models with table-specific architectures and pre-training methods perform well at understanding table structures, but they still struggle with tasks that require various table reasoning skills. In this work, we develop ReasTAP to show that high-level table reasoning skills can be injected into models during pre-training without a complex table-specific architecture design. We define seven table reasoning skills, such as numerical operation, temporal comparison, and conjunction. Each reasoning skill is associated with an example generator, which synthesizes questions over semi-structured tables according to sampled templates. We model table pre-training as a sequence generation task and pre-train ReasTAP to generate precise answers to the synthetic examples. ReasTAP is evaluated on four benchmarks covering three downstream tasks: 1) WikiSQL and WTQ for Table Question Answering; 2) TabFact for Table Fact Verification; and 3) LogicNLG for Faithful Table-to-Text Generation. Experimental results demonstrate that ReasTAP achieves new state-of-the-art performance on all benchmarks and delivers significant improvements in low-resource settings. Our code is publicly available at https://github.com/Yale-LILY/ReasTAP.
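To illustrate the template-based synthesis described above, the sketch below shows how one reasoning skill (numerical operation) could be turned into a (question, flattened table, answer) pre-training instance for a sequence-to-sequence model. This is a minimal sketch, not the ReasTAP implementation: the template wording, the `flatten_table` serialization format, and all helper names are assumptions made for illustration.

```python
# Minimal sketch of synthetic pre-training example generation.
# Template text, table serialization, and helper names are illustrative assumptions.

TEMPLATES = {
    "temporal_comparison": (
        "what was the {col} when the {date_col} was earlier, {v1} or {v2}?"
    ),
    "numerical_operation": (
        "what is the sum of {col} for rows where {key_col} is {key_val}?"
    ),
}

def flatten_table(table):
    """Linearize a table into a string, e.g. 'col : a | b row 1 : x | y ...'."""
    header = "col : " + " | ".join(table["header"])
    rows = [
        f"row {i + 1} : " + " | ".join(str(c) for c in row)
        for i, row in enumerate(table["rows"])
    ]
    return " ".join([header] + rows)

def generate_numerical_example(table, col, key_col, key_val):
    """Instantiate a numerical-operation template and compute the gold answer."""
    question = TEMPLATES["numerical_operation"].format(
        col=col, key_col=key_col, key_val=key_val
    )
    col_idx = table["header"].index(col)
    key_idx = table["header"].index(key_col)
    answer = sum(
        float(row[col_idx]) for row in table["rows"] if row[key_idx] == key_val
    )
    # Source/target pair for seq2seq pre-training: the model reads the question
    # concatenated with the flattened table and is trained to generate the answer.
    source = question + " " + flatten_table(table)
    target = str(answer)
    return source, target

if __name__ == "__main__":
    table = {
        "header": ["team", "year", "points"],
        "rows": [["Lions", "2010", "34"], ["Lions", "2011", "28"], ["Bears", "2010", "17"]],
    }
    src, tgt = generate_numerical_example(table, "points", "team", "Lions")
    print(src)
    print(tgt)  # 62.0
```

Each of the seven skills would contribute its own set of templates and a generator of this kind; the resulting (source, target) pairs can then be fed to any encoder-decoder model as ordinary sequence generation training data.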
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Table Question Answering | WTQ | Accuracy | 9.96 | 101 |
| Table Fact Verification | TabFact (test) | Accuracy | 84.7 | 98 |
| Table Question Answering | WikiTQ (test) | Accuracy | 58.6 | 92 |
| Table Question Answering | WikiTableQuestions (test) | Accuracy | 58.6 | 86 |
| Table Question Answering | HiTab | Accuracy | 20.54 | 67 |
| Table Fact Verification | TabFact small (test) | Accuracy | 0.862 | 57 |
| Table Question Answering | WikiSQL (test) | Accuracy | 88.8 | 55 |
| Table Question Answering | TabMWP | Accuracy | 19.54 | 53 |
| Table Question Answering | AIT-QA | Accuracy | 48.49 | 41 |
| Table Fact Verification | TabFact simple (test) | Accuracy | 94.1 | 39 |