
Data-to-text Generation with Entity Modeling

About

Recent approaches to data-to-text generation have shown great promise thanks to the use of large-scale datasets and the application of neural network architectures which are trained end-to-end. These models rely on representation learning to select content appropriately, structure it coherently, and verbalize it grammatically, treating entities as nothing more than vocabulary tokens. In this work we propose an entity-centric neural architecture for data-to-text generation. Our model creates entity-specific representations which are dynamically updated. Text is generated conditioned on the data input and entity memory representations using hierarchical attention at each time step. We present experiments on the RotoWire benchmark and a (five times larger) new dataset on the baseball domain which we create. Our results show that the proposed model outperforms competitive baselines in automatic and human evaluation.
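The abstract describes generation conditioned on the input records and on dynamically updated entity memories via hierarchical attention. The sketch below (not the authors' code; all names, shapes, and the scoring function are illustrative assumptions) shows the two-level idea for a single decoding step: attend over each entity's record encodings to get a per-entity summary, then attend over entities using their memory representations to form the context vector.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def hierarchical_attention(decoder_state, entity_memories, entity_records):
    """One decoding step of two-level (hierarchical) attention.

    decoder_state:   (d,) current decoder hidden state
    entity_memories: (n_entities, d) dynamically updated entity representations
    entity_records:  list of (n_records_i, d) record encodings, one array per entity
    Returns the context vector and the attention weights over entities.
    """
    # Level 1: attention over each entity's records -> one summary vector per entity
    summaries = []
    for records in entity_records:
        scores = records @ decoder_state           # (n_records_i,) dot-product scores
        weights = softmax(scores)
        summaries.append(weights @ records)        # (d,) weighted record summary
    summaries = np.stack(summaries)                # (n_entities, d)

    # Level 2: attention over entities, scored against their memory representations
    entity_scores = entity_memories @ decoder_state
    entity_weights = softmax(entity_scores)        # (n_entities,) sums to 1
    context = entity_weights @ summaries           # (d,) context for the next token
    return context, entity_weights

# Toy usage with random encodings
rng = np.random.default_rng(0)
d = 4
h = rng.normal(size=d)                             # decoder state
memories = rng.normal(size=(3, d))                 # 3 entities
records = [rng.normal(size=(5, d)) for _ in range(3)]
context, entity_weights = hierarchical_attention(h, memories, records)
```

In the paper's model the entity memories themselves are updated as generation proceeds; here they are fixed inputs to keep the attention mechanism isolated.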

Ratish Puduppully, Li Dong, Mirella Lapata • 2019

Related benchmarks

Task | Dataset | Metric | Result | Rank
Data-to-text generation | MLB (test) | RG Precision | 81.1 | 22
Data-to-text generation | RotoWire (test) | Factual Support Score | 4.77 | 19
Data-to-text generation | ROTOWIRE (dev) | RG Score | 0.3184 | 12
Data-to-text generation | ROTOWIRE English (test) | RG Score | 32.7 | 12
Knowledge Selection | RotoWire-FG | Relation Generation P | 98.89 | 10
Data-to-text generation | German ROTOWIRE (DE-RW) (test) | RG Score | 17.4 | 8
Data-to-text generation | MLB (dev) | RG Score | 21.32 | 4

Other info

Code
