
GPT-too: A language-model-first approach for AMR-to-text generation

About

Abstract Meaning Representations (AMRs) are broad-coverage sentence-level semantic graphs. Existing approaches to generating text from AMR have focused on training sequence-to-sequence or graph-to-sequence models on AMR-annotated data only. In this paper, we propose an alternative approach that combines a strong pre-trained language model with cycle-consistency-based re-scoring. Despite the simplicity of the approach, our experimental results show these models outperform all previous techniques on the English LDC2017T10 dataset, including the recent use of transformer architectures. In addition to the standard evaluation metrics, we provide human evaluation experiments that further substantiate the strength of our approach.
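The cycle-consistency re-scoring described in the abstract can be sketched as follows: generate several candidate sentences with the language model, parse each candidate back into an AMR, and keep the candidate whose re-parsed graph best matches the input graph. The sketch below is illustrative only; `toy_parse` and the token-level F1 are hypothetical stand-ins for the paper's fine-tuned GPT-2 generator, AMR parser, and Smatch-style graph similarity.

```python
def rescore(candidates, source_amr, parse, similarity):
    """Pick the candidate whose re-parsed AMR best matches the source AMR."""
    return max(candidates, key=lambda text: similarity(parse(text), source_amr))

# Toy stand-ins (NOT the paper's components): parse maps a sentence to a
# set of concept tokens; similarity is token-set F1, a rough proxy for Smatch.
def toy_parse(text):
    return set(text.lower().split())

def f1(pred, gold):
    if not pred or not gold:
        return 0.0
    overlap = len(pred & gold)
    p, r = overlap / len(pred), overlap / len(gold)
    return 2 * p * r / (p + r) if (p + r) else 0.0

# A graph whose concepts are {boy, want, go}: the second candidate covers
# all three concepts, so re-scoring selects it.
source = {"boy", "want", "go"}
candidates = ["the boy wants to leave", "the boy want go", "a girl wants to go"]
print(rescore(candidates, source, toy_parse, f1))  # → the boy want go
```

In the paper, the same selection step operates over beam-search outputs of the fine-tuned language model, with a full AMR parser and graph-matching metric in place of these toys.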

Manuel Mager, Ramon Fernandez Astudillo, Tahira Naseem, Md Arafat Sultan, Young-Suk Lee, Radu Florian, Salim Roukos • 2020

Related benchmarks

Task | Dataset | Metric | Result | Rank
AMR-to-text generation | LDC2017T10 (test) | BLEU | 33.02 | 55
AMR-to-text | LDC2017T10 AMR17 (test) | chrF++ | 63.89 | 17
AMR-to-text generation | AMR 2.0 (test) | BLEU | 33.0 | 10
Graph-to-text generation | AMR (Human Evaluation) | Fluency | 5.69 | 5
