
Improved Relation Extraction with Feature-Rich Compositional Embedding Models

About

Compositional embedding models build a representation (or embedding) for a linguistic structure from the embeddings of its component words. We propose a Feature-rich Compositional Embedding Model (FCM) for relation extraction that is expressive, generalizes to new domains, and is easy to implement. The key idea is to combine (unlexicalized) hand-crafted features with learned word embeddings. The model directly addresses difficulties faced by traditional compositional embedding models, such as handling arbitrary types of sentence annotations and using global information for composition. We evaluate the proposed model on two relation extraction tasks and show that it outperforms both previous compositional models and traditional feature-rich models on the ACE 2005 relation extraction task and the SemEval 2010 relation classification task. Combining our model with a log-linear classifier over hand-crafted features gives state-of-the-art results.
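The composition described above can be sketched in a few lines. This is an illustrative toy, not the authors' released implementation: the shapes, names, and random toy inputs are assumptions. Each word contributes the outer product of its hand-crafted binary feature vector with its word embedding; the per-word outer products are summed, and a learned per-label tensor scores the result.

```python
import numpy as np

rng = np.random.default_rng(0)

num_feats, emb_dim, num_labels = 6, 4, 3               # toy sizes (assumed)
T = rng.normal(size=(num_labels, num_feats, emb_dim))  # learned label tensors

def fcm_scores(feature_vectors, embeddings, T):
    """Score each relation label for one sentence.

    feature_vectors: (n_words, num_feats) binary hand-crafted features
    embeddings:      (n_words, emb_dim) pre-trained word embeddings
    """
    # Sum of per-word outer products f_w (x) e_w -> (num_feats, emb_dim)
    composed = np.einsum('wf,we->fe', feature_vectors, embeddings)
    # Contract the composed matrix with each label tensor -> one score per label
    return np.einsum('yfe,fe->y', T, composed)

# Toy sentence of 5 words with random features and embeddings
feats = rng.integers(0, 2, size=(5, num_feats)).astype(float)
embs = rng.normal(size=(5, emb_dim))
scores = fcm_scores(feats, embs, T)

# Softmax over labels to obtain a distribution
probs = np.exp(scores - scores.max())
probs /= probs.sum()
```

Because the features are unlexicalized (e.g. position or dependency-path indicators), the same tensor parameters apply to unseen words, which is what lets the model generalize across domains.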

Matthew R. Gormley, Mo Yu, Mark Dredze • 2015

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Relation Classification | SemEval-2010 Task 8 (test) | F1 Score | 83.4 | 128 |
| Relation Extraction | NYT (test) | F1 Score | 24 | 85 |
| Relation Extraction | ACE05 (test) | -- | -- | 72 |
| Relation Extraction | Wiki-KBP (test) | F1 Score | 30.1 | 59 |
| Relation Extraction | ACE bc 2005 (test) | Precision | 74.39 | 22 |
| Relation Extraction | ACE out-of-domain cts 2005 (test) | Precision | 74.53 | 14 |
| Relation Extraction | BioInfer (test) | Precision | 0.535 | 11 |
| Relation Extraction | ACE wl domain 2005 (test) | Precision | 65.63 | 10 |
| Relation Classification | NYT (test) | Accuracy | 68.8 | 10 |
| Relation Classification | Wiki-KBP (test) | Accuracy | 61.7 | 10 |
Showing 10 of 17 rows
