Tab-CoT: Zero-shot Tabular Chain of Thought
About
Chain-of-thought (CoT) prompting methods have been successful in various natural language processing (NLP) tasks thanks to their ability to unveil the underlying complex reasoning processes. Such reasoning processes typically exhibit implicitly structured steps. Recent efforts have also begun investigating methods that encourage more explicitly structured reasoning procedures to be captured. In this work, we propose Tab-CoT, a novel tabular-format CoT prompting method, which allows the complex reasoning process to be explicitly modelled in a highly structured manner. Despite its simplicity, we show that our approach is capable of performing reasoning across multiple dimensions (i.e., both rows and columns). We demonstrate our approach's strong zero-shot and few-shot capabilities through extensive experiments on a range of reasoning tasks.
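The idea above can be sketched as a prompt-construction helper: instead of appending a free-text trigger such as "Let's think step by step", Tab-CoT appends a markdown table header so the model fills in its reasoning row by row. The exact column names below (`step`, `subquestion`, `process`, `result`) and the helper function are illustrative assumptions, not the paper's verbatim template.

```python
def build_tab_cot_prompt(
    question: str,
    # Assumed column scheme; the actual header used in Tab-CoT may differ.
    header: str = "|step|subquestion|process|result|",
) -> str:
    """Construct a zero-shot Tab-CoT prompt: the question followed by a
    table header that elicits structured, row-by-row reasoning."""
    return f"{question}\n{header}"

prompt = build_tab_cot_prompt(
    "A pen costs $1.20 and a pencil costs $0.50. "
    "How much do 2 pens and 3 pencils cost?"
)
print(prompt)
```

The model's completion would then be the table body, with each row one reasoning step and the final `result` cell holding the answer.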
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Table Fact Verification | TabFact (test) | Accuracy | 79.05 | 136 |
| Table Question Answering | WikiTQ (test) | Accuracy | 60.43 | 130 |
| Table Fact Verification | TabFact | Accuracy | 0.798 | 104 |
| Financial Question Answering | FinQA (test) | Accuracy | 27.81 | 57 |
| Table Question Answering | WikiTQ | Accuracy | 60.6 | 29 |
| Table Mathematical Reasoning | TabMWP (test) | Accuracy | 92.21 | 15 |
| Hierarchical Table Question Answering | HiTab (test) | Accuracy (%) | 60.64 | 15 |