Syntax-aware Neural Semantic Role Labeling with Supertags
About
We introduce a new syntax-aware model for dependency-based semantic role labeling that outperforms syntax-agnostic models for English and Spanish. We use a BiLSTM to tag the text with supertags extracted from dependency parses, and we feed these supertags, along with words and parts of speech, into a deep highway BiLSTM for semantic role labeling. Our model combines the strengths of earlier models that performed SRL on the basis of a full dependency parse with those of more recent models that use no syntactic information at all. Our local, non-ensemble model achieves state-of-the-art performance on the CoNLL 2009 English and Spanish datasets. SRL models benefit from syntactic information, and we show that supertagging is a simple, powerful, and robust way to incorporate syntax into a neural SRL system.
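To illustrate the supertagging step, here is a minimal sketch of how supertags might be read off a dependency parse. The scheme shown (each word's dependency relation combined with the direction of its head) is an illustrative assumption, not necessarily the exact supertag design used in the paper, and the `extract_supertags` helper and its tuple format are hypothetical.

```python
# Hedged sketch: deriving simple supertags from a dependency parse.
# Assumed supertag scheme: dependency relation + head direction (L/R/ROOT).

def extract_supertags(tokens):
    """tokens: list of (index, word, head_index, relation) tuples,
    with 1-based indices and head_index 0 marking the root.
    Returns one supertag string per word."""
    supertags = []
    for idx, word, head, rel in tokens:
        if head == 0:
            direction = "ROOT"      # root of the sentence
        elif head < idx:
            direction = "L"         # head lies to the left of this word
        else:
            direction = "R"         # head lies to the right of this word
        supertags.append(f"{rel}/{direction}")
    return supertags

# Example parse of "She eats apples" (hypothetical CoNLL-style tuples):
parse = [(1, "She", 2, "nsubj"),
         (2, "eats", 0, "root"),
         (3, "apples", 2, "obj")]
print(extract_supertags(parse))  # ['nsubj/R', 'root/ROOT', 'obj/L']
```

In the full model, a BiLSTM would predict such supertags from words alone, and the predicted tags would then be embedded and concatenated with word and part-of-speech embeddings as input to the SRL network.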
Related benchmarks
| Task | Dataset | Metric | Score | Rank |
|---|---|---|---|---|
| Dependency Semantic Role Labeling | CoNLL 2009 (test) | F1 | 90.2 | 32 |
| Dependency-based Semantic Role Labeling | CoNLL 2009 Brown (test) | Precision | 81 | 22 |
| Dependency-based Semantic Role Labeling | CoNLL 2009 (out-of-domain, Brown) | F1 | 80.8 | 17 |
| Argument Labeling | CoNLL 2009 dependency-based SRL (WSJ) | Precision | 90.3 | 10 |