
Levi Graph AMR Parser using Heterogeneous Attention

About

Coupled with biaffine decoders, transformers have been effectively adapted to text-to-graph transduction and have achieved state-of-the-art performance on AMR parsing. Many prior works, however, rely on the biaffine decoder for arc and/or label prediction, although most features used by the decoder may already be learned by the transformer. This paper presents a novel approach to AMR parsing that combines heterogeneous data (tokens, concepts, labels) into a single transformer input to learn attention, and uses only the transformer's attention matrices to predict all elements of AMR graphs (concepts, arcs, labels). Although our models use significantly fewer parameters than the previous state-of-the-art graph parser, they show similar or better accuracy on AMR 2.0 and 3.0.
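The key representational idea is the Levi graph: a labeled graph is rewritten so that each edge label becomes a node of its own, leaving only unlabeled arcs. This lets concepts and labels be treated as one homogeneous node sequence for the transformer. A minimal sketch of that transformation (function and node names are illustrative, not the authors' implementation):

```python
def to_levi_graph(nodes, edges):
    """Convert a labeled graph into a Levi graph.

    nodes: list of concept names
    edges: list of (src, label, dst) triples
    Each labeled arc (u, label, v) becomes a new label node with two
    unlabeled arcs: u -> label_node and label_node -> v.
    """
    levi_nodes = list(nodes)
    levi_edges = []
    for src, label, dst in edges:
        # Give each arc's label node a unique id, since the same label
        # (e.g. ARG0) can occur on multiple arcs.
        label_node = f"{label}#{len(levi_nodes)}"
        levi_nodes.append(label_node)
        levi_edges.append((src, label_node))
        levi_edges.append((label_node, dst))
    return levi_nodes, levi_edges

# Example AMR fragment for "The boy wants to go":
#   want-01 :ARG0 boy, want-01 :ARG1 go-01, go-01 :ARG0 boy
nodes = ["want-01", "boy", "go-01"]
edges = [("want-01", "ARG0", "boy"),
         ("want-01", "ARG1", "go-01"),
         ("go-01", "ARG0", "boy")]
levi_nodes, levi_edges = to_levi_graph(nodes, edges)
```

After the transformation, the three concepts and three label nodes form one six-node sequence connected only by unlabeled arcs, so a single attention-based arc predictor suffices.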

Han He, Jinho D. Choi • 2021

Related benchmarks

Task         Dataset                      Result               Rank
AMR parsing  AMR 2.0, LDC2017T10 (test)   -                    168
AMR parsing  AMR 3.0, LDC2020T02 (test)   Smatch (Labeled) 77  14
