
Highly Parallel Autoregressive Entity Linking with Discriminative Correction

About

Generative approaches have recently been shown to be effective for both Entity Disambiguation and Entity Linking (i.e., joint mention detection and disambiguation). However, the previously proposed autoregressive formulation for EL suffers from i) high computational cost due to a complex (deep) decoder, ii) non-parallelizable decoding that scales with the source sequence length, and iii) the need for training on a large amount of data. In this work, we propose a highly efficient approach that parallelizes autoregressive linking across all potential mentions and relies on a shallow and efficient decoder. Moreover, we augment the generative objective with an extra discriminative component, i.e., a correction term which lets us directly optimize the generator's ranking. Taken together, these techniques tackle all the above issues: our model is >70 times faster and more accurate than the previous generative method, outperforming state-of-the-art approaches on the standard English dataset AIDA-CoNLL. Source code is available at https://github.com/nicola-decao/efficient-autoregressive-EL
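The discriminative correction described above can be pictured as a ranking loss over candidate entities, using the generator's autoregressive sequence log-probabilities as scores. The sketch below is illustrative only: the function name, the toy scores, and the softmax cross-entropy form of the correction are assumptions for exposition, not the paper's exact objective.

```python
import math

def combined_loss(candidate_logprobs, gold_idx, alpha=1.0):
    """Generative NLL plus a discriminative ranking correction (sketch).

    candidate_logprobs: total autoregressive log-probability of each
                        candidate entity's name under the generator.
    gold_idx:           index of the gold entity among the candidates.
    alpha:              weight of the discriminative correction term.
    """
    # Generative term: negative log-likelihood of the gold entity name.
    nll = -candidate_logprobs[gold_idx]
    # Discriminative correction: softmax cross-entropy over the candidate
    # set, treating the sequence log-probs as ranking scores. This directly
    # penalizes the generator when a wrong candidate outscores the gold one.
    log_z = math.log(sum(math.exp(s) for s in candidate_logprobs))
    correction = -(candidate_logprobs[gold_idx] - log_z)
    return nll + alpha * correction
```

Under this formulation the loss is lowest when the gold entity's name is both likely in absolute terms (generative term) and ranked above the other candidates (discriminative term).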

Nicola De Cao, Wilker Aziz, Ivan Titov • 2021

Related benchmarks

Task                         | Dataset                                                                                          | Metric                 | Result | Rank
Entity Linking               | AIDA (testb)                                                                                     | Micro F1               | 85.5   | 28
Entity Linking               | AIDA (testa)                                                                                     | Micro F1               | 90.1   | 23
Entity Linking               | AIDA-CoNLL Wikipedia 2019 (test)                                                                 | Micro F1               | 85.5   | 18
Named Entity Disambiguation | MSNBC out-of-domain (test)                                                                       | Micro F1 (InKB)        | 19.8   | 18
Entity Linking               | GERBIL                                                                                           | InKB Micro F1 (AIDA-B) | 85.5   | 15
Entity Linking               | AIDA and Out-of-domain (MSNBC, Derczynski, KORE50, N3-Reuters-128, N3-RSS-500, OKE-15, OKE-16) (test) | AIDA Performance  | 85.5   | 12
Entity Linking               | OKE 16 (out-of-domain)                                                                           | InKB Micro F1          | 15.2   | 11
Entity Linking               | KORE50 (out-of-domain)                                                                           | InKB Micro F1          | 0.082  | 11
Entity Linking               | N3-Reuters-128 (out-of-domain)                                                                   | InKB Micro F1          | 22.7   | 11
Entity Linking               | N3-RSS-500 (out-of-domain)                                                                       | InKB Micro F1          | 8.3    | 11

(Showing 10 of 14 rows.)

Other info

Code: https://github.com/nicola-decao/efficient-autoregressive-EL
