
Syntax-Aware Graph-to-Graph Transformer for Semantic Role Labelling

About

Recent models have shown that incorporating syntactic knowledge into the semantic role labelling (SRL) task leads to significant improvements. In this paper, we propose the Syntax-aware Graph-to-Graph Transformer (SynG2G-Tr) model, which encodes syntactic structure by inputting graph relations as embeddings directly into the self-attention mechanism of the Transformer. This approach adds a soft bias towards attention patterns that follow the syntactic structure, while still allowing the model to use this information to learn alternative patterns. We evaluate our model on both span-based and dependency-based SRL datasets (CoNLL 2005 and CoNLL 2009), and outperform previous alternative methods in both in-domain and out-of-domain settings.
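The core idea described above, injecting graph relations as embeddings into the self-attention logits so that syntax biases but does not dictate the attention pattern, can be sketched roughly as follows. This is a minimal single-head illustration under assumed names and shapes, not the paper's actual implementation:

```python
import numpy as np

def graph_biased_attention(Q, K, V, rel_emb, rel_ids):
    """Self-attention with a soft syntactic bias (illustrative sketch).

    Q, K, V  : (n, d) query/key/value matrices for n tokens
    rel_emb  : (num_relations, d) one embedding per graph relation label
               (label 0 assumed to mean "no relation")
    rel_ids  : (n, n) integers; rel_ids[i, j] is the relation label
               between tokens i and j in the syntactic graph
    """
    n, d = Q.shape
    # Standard content-content attention logits.
    logits = Q @ K.T / np.sqrt(d)
    # Soft bias: each query also scores the embedding of its relation
    # to every key, so syntactically related pairs get a learned boost
    # while other attention patterns remain possible.
    R = rel_emb[rel_ids]                              # (n, n, d)
    logits += np.einsum('id,ijd->ij', Q, R) / np.sqrt(d)
    # Softmax over keys, then aggregate values.
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

With all relation embeddings set to zero this reduces exactly to standard scaled dot-product attention, which is what makes the syntactic signal a soft bias rather than a hard constraint.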

Alireza Mohammadshahi, James Henderson • 2021

Related benchmarks

Task | Dataset | Result | Rank
Span-based Semantic Role Labeling | CoNLL 2005 (out-of-domain, Brown) | F1 Score: 83.21 | 41
Semantic Role Labeling | CoNLL 2005 (WSJ) | F1 Score: 88.9 | 41
Dependency Semantic Role Labeling | CoNLL 2009 (test) | F1 Score: 93.03 | 32
Semantic Role Labeling | CoNLL 2005 (WSJ, in-domain) | F1 Score: 88.93 | 24
Dependency-based Semantic Role Labeling | CoNLL 2009 (Brown, test) | Precision: 88.27 | 22

Other info

Code
