
ERNIE: Enhanced Language Representation with Informative Entities

About

Neural language representation models such as BERT, pre-trained on large-scale corpora, capture rich semantic patterns from plain text and can be fine-tuned to consistently improve performance on various NLP tasks. However, existing pre-trained language models rarely incorporate knowledge graphs (KGs), which can provide rich structured knowledge facts for better language understanding. We argue that informative entities in KGs can enhance language representation with external knowledge. In this paper, we utilize both large-scale textual corpora and KGs to train an enhanced language representation model (ERNIE), which takes full advantage of lexical, syntactic, and knowledge information simultaneously. Experimental results demonstrate that ERNIE achieves significant improvements on various knowledge-driven tasks, while remaining comparable to the state-of-the-art model BERT on other common NLP tasks. The source code of this paper can be obtained from https://github.com/thunlp/ERNIE.
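The core idea in the abstract is that a token's representation can be enriched with the embedding of the KG entity it mentions. A minimal, hypothetical sketch of that fusion step (not the authors' code; all dimensions, weight matrices, and the tanh activation here are placeholder assumptions, and the real model uses BERT-sized transformer layers):

```python
# Toy sketch of entity-aware fusion: a token vector and the embedding of its
# aligned KG entity are each linearly projected, summed, and passed through a
# nonlinearity. Tokens with no aligned entity skip the entity term.
import math
import random

random.seed(0)
DIM = 4  # toy hidden size (assumption; far smaller than the real model)

def rand_matrix(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def matvec(m, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

W_token = rand_matrix(DIM, DIM)   # projects the token representation
W_entity = rand_matrix(DIM, DIM)  # projects the aligned entity embedding
bias = [0.0] * DIM

def fuse(token_vec, entity_vec=None):
    """Combine a token vector with its aligned entity vector, if any."""
    h = matvec(W_token, token_vec)
    if entity_vec is not None:
        e = matvec(W_entity, entity_vec)
        h = [hi + ei for hi, ei in zip(h, e)]
    return [math.tanh(hi + bi) for hi, bi in zip(h, bias)]

token = [0.5, -0.2, 0.1, 0.3]   # hypothetical token representation
entity = [0.9, 0.1, -0.4, 0.2]  # hypothetical KG entity embedding

fused = fuse(token, entity)  # entity-informed representation
plain = fuse(token)          # same token without an aligned entity
```

The point of the sketch is only the shape of the computation: knowledge enters the text encoder through a per-token aligned entity embedding rather than through extra input text.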

Zhengyan Zhang, Xu Han, Zhiyuan Liu, Xin Jiang, Maosong Sun, Qun Liu • 2019

Related benchmarks

Task                             | Dataset                          | Metric    | Result | Rank
Commonsense Reasoning            | PIQA                             | Accuracy  | 66.47  | 751
Commonsense Reasoning            | COPA                             | Accuracy  | 68.9   | 197
Relation Extraction              | TACRED (test)                    | F1 Score  | 70.7   | 194
Commonsense Reasoning            | OBQA                             | Accuracy  | 58.9   | 117
Commonsense Reasoning            | SocialIQA                        | Accuracy  | 65.1   | 116
Relation Extraction              | TACRED                           | Micro F1  | 67.97  | 97
Abductive Commonsense Reasoning  | ANLI (test)                      | Accuracy  | 63.04  | 53
Commonsense Reasoning            | CommonsenseQA (CSQA) v1.0 (test) | Accuracy  | 54.06  | 46
Few-shot Relation Classification | FewRel 1.0 (test)                | --        | --     | 36
Commonsense Reasoning            | aNLI                             | Accuracy  | 63.04  | 35

Showing 10 of 31 rows

Other info

Code: https://github.com/thunlp/ERNIE
