
Table-To-Text generation and pre-training with TabT5

About

Encoder-only transformer models have been successfully applied to different table understanding tasks, as in TAPAS (Herzig et al., 2020). A major limitation of these architectures is that they are constrained to classification-like tasks such as cell selection or entailment detection. We present TABT5, an encoder-decoder model that generates natural language text based on tables and textual inputs. TABT5 overcomes the encoder-only limitation by incorporating a decoder component and leverages the input structure with table-specific embeddings and pre-training. TABT5 achieves new state-of-the-art results in several domains, including spreadsheet formula prediction with a 15% increase in sequence accuracy, QA with a 2.5% increase in sequence accuracy, and data-to-text generation with a 2.5% increase in BLEU.
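As a rough illustration of what "table-specific embeddings" can mean in practice (following the TAPAS-style scheme the abstract builds on), the sketch below sums token, row, and column embeddings for each position of a flattened table. All names, sizes, and the index-0-for-text convention are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, max_rows, max_cols, d_model = 100, 8, 4, 16

# Separate lookup tables for token, row, and column embeddings
# (hypothetical shapes; real models learn these jointly).
tok_emb = rng.normal(size=(vocab_size, d_model))
row_emb = rng.normal(size=(max_rows, d_model))
col_emb = rng.normal(size=(max_cols, d_model))

def embed(token_ids, row_ids, col_ids):
    """Sum token, row, and column embeddings per position.
    Row/column index 0 is assumed reserved for plain-text tokens."""
    return tok_emb[token_ids] + row_emb[row_ids] + col_emb[col_ids]

# A question token followed by three table cells, flattened row by row.
token_ids = np.array([5, 17, 42, 7])
row_ids   = np.array([0, 1, 1, 2])   # question token gets row 0
col_ids   = np.array([0, 1, 2, 1])

x = embed(token_ids, row_ids, col_ids)
print(x.shape)  # (4, 16)
```

The encoder then attends over these structure-aware representations, so each token "knows" which cell it came from; the decoder can generate free-form text (an answer, a formula, or a description) rather than only classifying cells.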

Ewa Andrejczuk, Julian Martin Eisenschlos, Francesco Piccinno, Syrine Krichene, Yasemin Altun • 2022

Related benchmarks

Task: Numerical Question Answering
Dataset: FinQA 1.0 (test)
Metric: Execution Accuracy
Result: 70.79
Rank: 14
