RAAT: Relation-Augmented Attention Transformer for Relation Modeling in Document-Level Event Extraction

About

In the document-level event extraction (DEE) task, event arguments often scatter across sentences (the cross-sentence issue) and multiple events may lie in one document (the multi-event issue). In this paper, we argue that the relation information among event arguments is of great significance for addressing the above two issues, and we propose a new DEE framework that can model relation dependencies, called Relation-augmented Document-level Event Extraction (ReDEE). More specifically, this framework features a novel, tailored transformer named the Relation-augmented Attention Transformer (RAAT). RAAT is scalable to capturing argument relations of multiple scales and amounts. To further leverage relation information, we introduce a separate event relation prediction task and adopt a multi-task learning method to explicitly enhance event extraction performance. Extensive experiments demonstrate the effectiveness of the proposed method, which achieves state-of-the-art performance on two public datasets. Our code is available at https://github.com/TencentYoutuResearch/RAAT.
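The core idea of relation-augmented attention — biasing attention scores with learned embeddings of the relation between each query/key pair — can be illustrated with a minimal single-head sketch. This is a generic illustration of the technique, not the paper's exact formulation; the function and variable names (`relation_augmented_attention`, `rel_ids`, `rel_emb`) are assumptions for the example:

```python
import numpy as np

def relation_augmented_attention(Q, K, V, rel_ids, rel_emb):
    """Single-head scaled dot-product attention with an additive relation bias.

    Q, K, V : (n, d) query/key/value matrices for n tokens.
    rel_ids : (n, n) integer id of the relation between token i and token j.
    rel_emb : (num_rels, d) learned relation embeddings.
    """
    n, d = Q.shape
    # standard scaled dot-product scores
    scores = Q @ K.T / np.sqrt(d)
    # relation bias: b[i, j] = Q[i] . rel_emb[rel_ids[i, j]] / sqrt(d)
    rel_k = rel_emb[rel_ids]                      # (n, n, d)
    bias = np.einsum('id,ijd->ij', Q, rel_k) / np.sqrt(d)
    scores = scores + bias
    # row-wise softmax over keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

With all relation embeddings set to zero, this reduces to vanilla attention, so the relation bias acts as a learnable correction on top of the standard transformer layer.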

Yuan Liang, Zhuoxuan Jiang, Di Yin, Bo Ren • 2022

Related benchmarks

Task                              Dataset                  Metric       Result  Rank
Document-level Event Extraction   ChFinAnn (test)          EF F1 Score  74.1    25
Event extraction                  DuEE-fin (dev)           Precision    77.0    4
Event extraction                  DuEE-fin online (test)   Precision    69.2    4
