
ERNIE: Enhanced Representation through Knowledge Integration

About

We present ERNIE (Enhanced Representation through kNowledge IntEgration), a novel language representation model enhanced by knowledge. Inspired by the masking strategy of BERT, ERNIE is designed to learn language representations enhanced by knowledge masking strategies, which include entity-level masking and phrase-level masking. The entity-level strategy masks entities, which are usually composed of multiple words. The phrase-level strategy masks a whole phrase, which is composed of several words standing together as a conceptual unit. Experimental results show that ERNIE outperforms other baseline methods, achieving new state-of-the-art results on five Chinese natural language processing tasks, including natural language inference, semantic similarity, named entity recognition, sentiment analysis, and question answering. We also demonstrate that ERNIE has stronger knowledge inference capacity on a cloze test.
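The difference between BERT-style token masking and ERNIE's knowledge masking can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: the `spans` argument stands in for entity/phrase boundaries that, in the actual model, come from external NER and chunking tools.

```python
import random

MASK = "[MASK]"

def token_masking(tokens, p=0.15, seed=0):
    # BERT-style: mask individual tokens independently at rate p.
    rng = random.Random(seed)
    return [MASK if rng.random() < p else t for t in tokens]

def span_masking(tokens, spans, p=0.5, seed=0):
    # ERNIE-style: mask whole entity/phrase spans as single units.
    # `spans` is a list of (start, end) index pairs (end exclusive)
    # marking entities or phrases; each span is masked entirely or
    # not at all, so the model must predict the full unit.
    rng = random.Random(seed)
    out = list(tokens)
    for start, end in spans:
        if rng.random() < p:
            for i in range(start, end):
                out[i] = MASK
    return out

tokens = ["Harry", "Potter", "is", "a", "series", "of", "fantasy", "novels"]
# "Harry Potter" is an entity span; "a series of" is a phrase span.
print(span_masking(tokens, [(0, 2), (3, 6)], p=1.0))
```

Because entire spans disappear at once, the model cannot recover an entity from its remaining pieces and instead has to use the surrounding context, which is where the knowledge-integration effect comes from.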

Yu Sun, Shuohuan Wang, Yukun Li, Shikun Feng, Xuyi Chen, Han Zhang, Xin Tian, Danxiang Zhu, Hao Tian, Hua Wu • 2019

Related benchmarks

Task | Dataset | Result | Rank
Natural Language Inference | XNLI (test) | Average Accuracy: 78.6 | 167
Natural Language Inference | SNLI (dev) | Accuracy: 79.9 | 71
Named Entity Recognition | NER (test) | F1 Score: 95.1 | 68
Named Entity Recognition | OntoNotes 4.0 (test) | F1 Score: 80.38 | 55
Machine Reading Comprehension | DRCD (test) | EM: 84 | 45
Machine Reading Comprehension | DRCD (dev) | EM: 84.6 | 45
Single Sentence Classification | THUCNews (dev) | Accuracy: 97.6 | 36
Machine Reading Comprehension | CMRC 2018 (dev) | EM: 66.89 | 34
Sentiment Analysis | ChnSentiCorp (dev) | Accuracy: 95.2 | 33
Sentiment Analysis | ChnSentiCorp (test) | Accuracy: 95.4 | 33
Showing 10 of 50 rows

Other info

Code
