
Graph-to-Sequence Learning using Gated Graph Neural Networks

About

Many NLP applications can be framed as a graph-to-sequence learning problem. Previous work proposing neural architectures for this setting obtained promising results compared to grammar-based approaches, but still relies on linearisation heuristics and/or standard recurrent networks to achieve the best performance. In this work, we propose a new model that encodes the full structural information contained in the graph. Our architecture couples the recently proposed Gated Graph Neural Networks with an input transformation that allows nodes and edges to have their own hidden representations, while tackling the parameter explosion problem present in previous work. Experimental results show that our model outperforms strong baselines in generation from AMR graphs and syntax-based neural machine translation.
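The input transformation described in the abstract can be illustrated with a Levi-graph-style conversion, which turns each labelled edge into a node of its own so that edge labels receive hidden representations just like nodes do. The sketch below is a minimal, hedged illustration of this idea; the function name and the (source, label, target) edge encoding are assumptions for the example, not the paper's actual code.

```python
def to_levi_graph(nodes, edges):
    """Convert a labelled graph into a Levi-graph-style unlabelled graph.

    nodes: list of node labels.
    edges: list of (src_index, edge_label, tgt_index) triples.
    Returns (new_nodes, new_edges), where every original edge label
    becomes its own node and each labelled edge is split into two
    unlabelled edges passing through that new node.
    """
    new_nodes = list(nodes)
    new_edges = []
    for src, label, tgt in edges:
        e_idx = len(new_nodes)
        new_nodes.append(label)          # the edge label becomes a node
        new_edges.append((src, e_idx))   # unlabelled edge: source -> edge-node
        new_edges.append((e_idx, tgt))   # unlabelled edge: edge-node -> target
    return new_nodes, new_edges

# Example on a toy AMR-like fragment: "want" has agent "boy" and theme "go".
nodes = ["want", "boy", "go"]
edges = [(0, "ARG0", 1), (0, "ARG1", 2)]
levi_nodes, levi_edges = to_levi_graph(nodes, edges)
```

After the transformation, all edges are unlabelled, so a single gated graph network can propagate information over nodes and former edge labels alike without needing one parameter matrix per edge type.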

Daniel Beck, Gholamreza Haffari, Trevor Cohn • 2018

Related benchmarks

Task                           Dataset                      Result      Rank
AMR-to-text generation         LDC2017T10 (test)            BLEU 23.3   55
Entity Description Generation  ENT-DESC main results 1.0    BLEU 24.4   16
