
Improving Entity Linking by Modeling Latent Entity Type Information

About

Existing state-of-the-art neural entity linking models employ an attention-based bag-of-words context model and pre-trained entity embeddings bootstrapped from word embeddings to assess topic-level context compatibility. However, the latent entity type information in the immediate context of the mention is neglected, which often causes these models to link mentions to incorrect entities of the wrong type. To tackle this problem, we propose to inject latent entity type information into the entity embeddings based on pre-trained BERT. In addition, we integrate a BERT-based entity similarity score into the local context model of a state-of-the-art model to better capture latent entity type information. Our model significantly outperforms state-of-the-art entity linking models on the standard benchmark (AIDA-CoNLL). Detailed experimental analysis demonstrates that our model corrects most of the type errors produced by the direct baseline.
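The abstract describes adding a BERT-based entity similarity score to a local context model. A minimal sketch of the idea, with hypothetical toy vectors standing in for the BERT-derived mention-context embedding and candidate entity embeddings (in the paper these come from pre-trained BERT, not from hand-written lists):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Hypothetical BERT embedding of the mention in its immediate context,
# e.g. "England" in "England won the World Cup in 1966".
mention_vec = [0.9, 0.1, 0.0]

# Hypothetical type-aware embeddings for two candidate entities.
candidates = {
    "England (national football team)": [1.0, 0.1, 0.0],
    "England (country)": [0.1, 1.0, 0.0],
}

# The BERT-based similarity score ranks candidates by context fit;
# in the full model this score is combined with the existing local
# context model rather than used alone.
scores = {name: cosine(mention_vec, vec) for name, vec in candidates.items()}
best = max(scores, key=scores.get)
```

Because the mention context is about football, a type-aware embedding should score the football-team entity above the country, which is exactly the class of type error the paper targets.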

Shuang Chen, Jinpeng Wang, Feng Jiang, Chin-Yew Lin • 2020

Related benchmarks

Task                          Dataset             Metric            Result   Rank
Entity Disambiguation         AIDA CoNLL (test)   In-KB Accuracy    93.54    36
Entity Linking                AQUAINT (test)      Micro F1 Score    89.8     27
Entity Linking                ACE2004 (test)      Micro F1 Score    88.9     27
Entity Linking                Wiki (test)         Micro F1          80.1     27
Entity Linking                CWEB (test)         --                --       26
Named Entity Disambiguation   AIDA (test)         Micro InKB F1     93.5     25
Entity Linking                MSNBC (test)        F1 Score          93.4     14
