Learning Dynamic Context Augmentation for Global Entity Linking
About
Despite the recent success of collective entity linking (EL) methods, these "global" inference methods may yield sub-optimal results when the "all-mention coherence" assumption breaks, and they often suffer from high computational cost at inference time due to the complex search space. In this paper, we propose a simple yet effective solution, called Dynamic Context Augmentation (DCA), for collective EL, which requires only one pass through the mentions in a document. DCA sequentially accumulates context information to make efficient, collective inferences, and can be combined with different local EL models as a plug-and-enhance module. We explore both supervised and reinforcement learning strategies for training the DCA model. Extensive experiments show the effectiveness of our model across different learning settings, base models, decision orders and attention mechanisms.
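The core idea of the one-pass inference can be sketched in a few lines: mentions are linked in sequence, and each decision is scored against both local evidence and the entities already chosen (the "dynamic context"). The sketch below is illustrative only; `local_score`, `coherence`, and `alpha` are hypothetical names standing in for the paper's actual model components.

```python
def dca_link(mentions, candidates_for, local_score, coherence, alpha=0.5):
    """One-pass collective linking with dynamic context augmentation.

    Each mention's candidates are scored by a local model plus a
    coherence term against previously linked entities; the chosen
    entity is then appended to the dynamic context for later mentions.
    (Hypothetical interface for illustration, not the paper's API.)
    """
    linked = []  # dynamic context: entities chosen so far
    for mention in mentions:
        best, best_score = None, float("-inf")
        for cand in candidates_for(mention):
            # Combine local evidence with coherence to the accumulated context.
            score = local_score(mention, cand) + alpha * coherence(cand, linked)
            if score > best_score:
                best, best_score = cand, score
        linked.append(best)
    return linked
```

Because the context grows incrementally, each mention is visited exactly once, which is what keeps inference cost low compared with global search over all mention-entity assignments.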
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Entity Disambiguation | AIDA CoNLL (test) | In-KB Accuracy | 94.64 | 36 |
| Entity Linking | Wiki (test) | Micro F1 | 78.84 | 27 |
| Entity Linking | AQUAINT (test) | Micro F1 | 88.53 | 27 |
| Entity Linking | ACE2004 (test) | Micro F1 | 0.9014 | 27 |
| Entity Linking | CWEB (test) | Micro F1 | 75.59 | 26 |
| Named Entity Disambiguation | AIDA (test) | Micro InKB F1 | 93.7 | 25 |
| Entity Disambiguation | Wiki (test) | Micro F1 | 78.8 | 24 |
| Entity Disambiguation | AIDA-CoNLL B (test) | -- | -- | 21 |
| Named Entity Disambiguation | MSNBC out-of-domain (test) | Micro F1 (InKB) | 93.8 | 18 |
| Entity Disambiguation | Standard Entity Disambiguation Datasets (AIDA, MSNBC, AQUAINT, ACE2004, CWEB, WIKI) InKB (test) | AIDA Score | 93.7 | 15 |