
A Graph-to-Sequence Model for AMR-to-Text Generation

About

The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph. The current state-of-the-art method uses a sequence-to-sequence model, leveraging an LSTM to encode a linearized AMR structure. Although able to model non-local semantic information, a sequence LSTM can lose information from the AMR graph structure, and thus faces challenges with large graphs, which result in long sequences. We introduce a neural graph-to-sequence model, using a novel LSTM structure to directly encode graph-level semantics. On a standard benchmark, our model shows superior results to existing methods in the literature.
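The abstract's "novel LSTM structure for directly encoding graph-level semantics" refers to a graph-state LSTM, where every node keeps a hidden state and repeatedly exchanges information with its neighbors through LSTM-style gates, so each iteration propagates information one more hop across the graph. The sketch below is an illustrative simplification, not the paper's model: it assumes a single shared weight matrix per gate and a plain symmetric adjacency matrix, whereas the actual model also conditions on edge labels and directions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def graph_state_step(h, c, x, adj, params):
    """One recurrent update of a (simplified) graph-state LSTM.

    Each node aggregates the hidden states of its neighbors (given by
    the adjacency matrix) and passes the result, together with its own
    input embedding, through standard LSTM gates.
    """
    W, U, b = params              # input weights, neighbor-message weights, bias
    m = adj @ h                   # message passing: sum of neighbor states
    z = x @ W + m @ U + b         # pre-activations for four gates, stacked
    d = h.shape[1]
    i = sigmoid(z[:, :d])         # input gate
    f = sigmoid(z[:, d:2*d])      # forget gate
    o = sigmoid(z[:, 2*d:3*d])    # output gate
    g = np.tanh(z[:, 3*d:])       # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Toy AMR-like graph: 3 concept nodes in a chain, random embeddings.
rng = np.random.default_rng(0)
n, d_in, d = 3, 4, 5
x = rng.normal(size=(n, d_in))
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
params = (rng.normal(size=(d_in, 4 * d)),
          rng.normal(size=(d, 4 * d)),
          np.zeros(4 * d))
h = np.zeros((n, d))
c = np.zeros((n, d))
for _ in range(2):                # two iterations -> two-hop information flow
    h, c = graph_state_step(h, c, x, adj, params)
print(h.shape)  # (3, 5)
```

After enough iterations, each node's state summarizes a multi-hop neighborhood of the AMR graph, and the resulting node states can be fed to an attention-based decoder in place of a linearized-sequence encoding.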

Linfeng Song, Yue Zhang, Zhiguo Wang, Daniel Gildea • 2018

Related benchmarks

Task             Dataset              Result       Rank
AMR Generation   LDC2015E86 (test)    BLEU: 33.6   37

Other info

Code
