
A Simple but Effective Pluggable Entity Lookup Table for Pre-trained Language Models

About

Pre-trained language models (PLMs) cannot reliably recall the rich factual knowledge about entities exhibited in large-scale corpora, especially for rare entities. In this paper, we propose to build a simple but effective Pluggable Entity Lookup Table (PELT) on demand by aggregating an entity's output representations across its multiple occurrences in the corpus. PELT can be compatibly plugged in as input to infuse supplemental entity knowledge into PLMs. Compared to previous knowledge-enhanced PLMs, PELT requires only 0.2%-5% of the pre-computation, and can acquire knowledge from out-of-domain corpora for domain adaptation scenarios. Experiments on knowledge-related tasks demonstrate that our method, PELT, can flexibly and effectively transfer entity knowledge from related corpora into PLMs with different architectures.
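The core operation the abstract describes, building a lookup table by aggregating an entity's output representations over its occurrences, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the PLM's output vectors at each entity occurrence have already been extracted, and uses simple mean aggregation with illustrative names (`build_pelt_table`, `occurrences`).

```python
import numpy as np

def build_pelt_table(occurrences):
    """Aggregate each entity's output representations across its
    occurrences into a single pluggable embedding.

    occurrences: dict mapping entity id -> list of output vectors,
    one per occurrence of the entity in the corpus (stand-ins here
    for a PLM's output at the entity's position in context).
    """
    table = {}
    for entity, vecs in occurrences.items():
        # Mean-pool the output vectors from every occurrence; the
        # resulting vector serves as the entity's lookup-table entry.
        table[entity] = np.mean(np.stack(vecs), axis=0)
    return table

# Toy usage: two occurrences of one entity, one of another.
occurrences = {
    "Q1": [np.array([1.0, 2.0]), np.array([3.0, 4.0])],
    "Q2": [np.array([0.5, 0.5])],
}
pelt = build_pelt_table(occurrences)
```

Because the table is computed only from model outputs on the corpus (no re-training of the PLM), it can be rebuilt on demand for a new domain corpus, which is what enables the domain-adaptation use case mentioned above.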

Deming Ye, Yankai Lin, Peng Li, Maosong Sun, Zhiyuan Liu • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Relation Extraction | Wiki80 | Accuracy | 0.85 | 51 |
| Few-shot Relation Classification | FewRel 1.0 (test) | -- | -- | 36 |
| Entity Typing | Wiki-ET | F1 Score | 77.9 | 24 |
| Knowledge Probing | LAMA | P@1 (Google-RE) | 13.3 | 20 |
| Relation Classification | Wiki80 (test) | Accuracy | 93.4 | 15 |
| Knowledge Probing | LAMA-UHN | P@1 (Google-RE) | 8.9 | 14 |
| Relation Classification | FewRel 2.0 (test) | Accuracy (5-way 1-shot) | 75 | 5 |
| Knowledge Infusion | Wikipedia (train) | Pre-computation Time (h) | 7 | 2 |

Other info

Code
