
Few-Shot Data-to-Text Generation via Unified Representation and Multi-Source Learning

About

We present a novel approach to structured data-to-text generation that addresses the limitations of existing methods, which primarily focus on specific types of structured data. Our method improves performance in multi-task training and in zero-shot and few-shot scenarios by providing a unified representation that can handle various forms of structured data, such as tables, knowledge-graph triples, and meaning representations. We demonstrate that our approach adapts effectively to new structured forms and improves performance over current methods: for example, it yields a 66% improvement in zero-shot BLEU scores when transferring models trained on table inputs to a knowledge-graph dataset. Our method is an important step towards a more general data-to-text generation framework.
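One way to picture the unified representation is to linearize every input form (table, knowledge-graph triples, meaning representation) into a shared head/relation/tail token sequence that a single text-to-text model can consume. The sketch below is illustrative only: the `<H>`/`<R>`/`<T>` markers and the helper names are assumptions for this example, not the paper's exact scheme.

```python
def linearize_triples(triples):
    # Knowledge-graph input: a list of (head, relation, tail) triples.
    return " ".join(f"<H> {h} <R> {r} <T> {t}" for h, r, t in triples)

def linearize_table(header, rows):
    # Table input: cast each non-key cell as a
    # (row entity, column name, cell value) triple,
    # using the first column as the row entity.
    triples = [(row[0], col, val)
               for row in rows
               for col, val in zip(header[1:], row[1:])]
    return linearize_triples(triples)

def linearize_mr(name, slots):
    # Meaning-representation input: a named entity with slot-value pairs.
    return linearize_triples([(name, slot, value) for slot, value in slots.items()])

print(linearize_triples([("Alan Turing", "born in", "London")]))
print(linearize_table(["Name", "City"], [["Turing", "London"]]))
print(linearize_mr("The Eagle", {"eatType": "pub"}))
```

Because all three structured forms map into the same marker vocabulary, one sequence-to-sequence model can be trained jointly on them and transferred across input types, which is the setting the zero-shot table-to-knowledge-graph result above evaluates.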

Alexander Hanbo Li, Mingyue Shang, Evangelia Spiliopoulou, Jie Ma, Patrick Ng, Zhiguo Wang, Bonan Min, William Wang, Kathleen McKeown, Vittorio Castelli, Dan Roth, Bing Xiang • 2023

Related benchmarks

Task | Dataset | Metric | Result | Rank
Data-to-text generation | DART (test) | BLEU | 50.2 | 42
Data-to-text generation | E2E (test) | BLEU | 43.2 | 33
Data-to-text generation | ToTTo full (test) | BLEU | 50.8 | 12
Logical Data-to-Text Generation | LOGICNLG (test) | BLEU-3 | 25.4 | 10
Data-to-text generation | DART KG | BLEU | 0.315 | 5
Data-to-text generation | WebNLG KG | BLEU (Unigram) | 39.8 | 5
Data-to-text generation | E2E clean MR | BLEU | 22.6 | 4
Table-to-text generation | LogicNLG Table | BLEU-3 | 8.9 | 3
