
Simple BERT Models for Relation Extraction and Semantic Role Labeling

About

We present simple BERT-based models for relation extraction and semantic role labeling. In recent years, state-of-the-art performance has been achieved using neural models by incorporating lexical and syntactic features such as part-of-speech tags and dependency trees. In this paper, extensive experiments on datasets for these two tasks show that without using any external features, a simple BERT-based model can achieve state-of-the-art performance. To our knowledge, we are the first to successfully apply BERT in this manner. Our models provide strong baselines for future research.
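The abstract's claim of "no external features" rests on how the input is presented to BERT. A minimal sketch of the two input formats commonly used in such setups, and consistent with the paper's description: for relation extraction, the subject and object entities are replaced with typed placeholder tokens; for SRL, the sentence and its predicate are encoded as a BERT sentence pair. Function names and the exact placeholder-token spelling here are illustrative assumptions, not taken verbatim from the paper.

```python
# Hedged sketch of BERT input formatting for the two tasks.
# Placeholder-token names ([SUBJ-*], [OBJ-*]) and helper names are
# illustrative assumptions, not the paper's exact vocabulary.

def mask_entities(tokens, subj_span, subj_type, obj_span, obj_type):
    """Replace subject/object spans with typed placeholder tokens so the
    relation classifier cannot rely on entity surface forms.
    Spans are half-open [start, end) token index pairs."""
    out = []
    for i, tok in enumerate(tokens):
        if subj_span[0] <= i < subj_span[1]:
            if i == subj_span[0]:          # one placeholder per span
                out.append(f"[SUBJ-{subj_type}]")
        elif obj_span[0] <= i < obj_span[1]:
            if i == obj_span[0]:
                out.append(f"[OBJ-{obj_type}]")
        else:
            out.append(tok)
    return out

def srl_pair(tokens, predicate_index):
    """Encode sentence and predicate as a BERT sentence pair:
    [CLS] sentence [SEP] predicate [SEP]."""
    return ["[CLS]"] + tokens + ["[SEP]", tokens[predicate_index], "[SEP]"]

tokens = "Barack Obama was born in Hawaii".split()
masked = mask_entities(tokens, (0, 2), "PERSON", (5, 6), "LOCATION")
# masked == ['[SUBJ-PERSON]', 'was', 'born', 'in', '[OBJ-LOCATION]']
pair = srl_pair(tokens, 3)  # predicate "born" appended as segment B
```

The formatted token sequences would then be converted to word-piece IDs and segment IDs by a standard BERT tokenizer; no part-of-speech tags or dependency trees are involved, which is the point of the "simple" baseline.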

Peng Shi, Jimmy Lin • 2019

Related benchmarks

Task | Dataset | Metric | Result | Rank
Relation Extraction | TACRED (test) | F1 | 67.8 | 194
Span-based Semantic Role Labeling | CoNLL 2005 (Out-of-domain, Brown) | F1 | 82 | 41
Semantic Role Labeling | CoNLL 2005 (WSJ) | F1 | 88.8 | 41
Semantic Role Labeling | CoNLL 2005 (Brown) | F1 | 82 | 31
Span Semantic Role Labeling | CoNLL-2012 (OntoNotes v5.0, test) | F1 | 86.5 | 25
Argument Classification | WikiEvents (test) | Head F1 | 54.48 | 23
Semantic Role Labeling | CoNLL 2012 | F1 | 86.5 | 21
Argument Identification | WikiEvents (test) | Head F1 | 69.83 | 20
Document-level Event Argument Extraction | RAMS (dev) | Span F1 | 39.2 | 18
Dependency-based Semantic Role Labeling | CoNLL 2009 (Out-of-domain, Brown) | F1 | 85.7 | 17

Showing 10 of 28 benchmark rows.
