Deep Graph Convolutional Encoders for Structured Data to Text Generation
About
Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods. These approaches linearise the input graph to be fed to a recurrent neural network. In this paper, we propose an alternative encoder based on graph convolutional networks that directly exploits the input structure. We report results on two graph-to-sequence datasets that empirically show the benefits of explicitly encoding the input graph structure.
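The abstract contrasts linearising a graph for a sequence encoder with a graph convolutional network (GCN) encoder that operates on the structure directly. As a rough illustration only (not the paper's implementation), a single graph-convolutional layer can be sketched as follows; the adjacency matrix `A`, features `X`, and mean-aggregation scheme here are illustrative assumptions:

```python
import numpy as np

def gcn_layer(X, A, W, b):
    # One graph-convolutional layer: each node aggregates features from
    # its neighbours (and itself, via self-loops), then applies a shared
    # linear transform followed by a ReLU non-linearity.
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)    # per-node degree
    H = (A_hat / deg) @ X                     # mean-aggregate neighbour features
    return np.maximum(0.0, H @ W + b)         # linear transform + ReLU

# Toy graph: 3 nodes with edges 0-1 and 1-2; 4-dim input, 2-dim output.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = rng.standard_normal((3, 4))
W = rng.standard_normal((4, 2))
b = np.zeros(2)
H = gcn_layer(X, A, W, b)
print(H.shape)  # (3, 2)
```

Because neighbour aggregation follows the graph's edges, node representations reflect the input structure without any lossy linearisation step.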
Diego Marcheggiani, Laura Perez-Beltrachini · 2018
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Data-to-text generation | WebNLG (test) | BLEU | 60.8 | 39 |
| Text Generation | WebNLG seen categories (test) | BLEU | 55.9 | 18 |
| Graph-to-text generation | WebNLG (test) | -- | -- | 18 |
| Entity Description Generation | ENT-DESC main results 1.0 | BLEU | 28.4 | 16 |
| Data-to-text generation | WebNLG Seen v1 | BLEU | 55.9 | 9 |
| Graph-to-Text | WebNLG v2.0 (test) | BLEU | 60.8 | 9 |
| Surface Realization | SR11Deep (test) | BLEU | 0.666 | 6 |