
Pretrained Encyclopedia: Weakly Supervised Knowledge-Pretrained Language Model

About

Recent breakthroughs in pretrained language models have shown the effectiveness of self-supervised learning for a wide range of natural language processing (NLP) tasks. In addition to standard syntactic and semantic NLP tasks, pretrained models achieve strong improvements on tasks that involve real-world knowledge, suggesting that large-scale language modeling could be an implicit method to capture knowledge. In this work, we further investigate the extent to which pretrained models such as BERT capture knowledge using a zero-shot fact completion task. Moreover, we propose a simple yet effective weakly supervised pretraining objective, which explicitly forces the model to incorporate knowledge about real-world entities. Models trained with our new objective yield significant improvements on the fact completion task. When applied to downstream tasks, our model consistently outperforms BERT on four entity-related question answering datasets (i.e., WebQuestions, TriviaQA, SearchQA and Quasar-T) with an average F1 improvement of 2.7, and on a standard fine-grained entity typing dataset (i.e., FIGER) with an accuracy gain of 5.7.
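The zero-shot fact completion probe mentioned above can be illustrated with a short sketch: a masked language model is asked to fill in a missing entity in a cloze-style statement, with no fine-tuning, so any correct completion must come from knowledge absorbed during pretraining. The model name and prompts below are illustrative assumptions, not the authors' exact evaluation protocol or data.

```python
# A minimal sketch of a zero-shot fact completion probe using an
# off-the-shelf masked language model (Hugging Face Transformers).
# Prompts and model choice are illustrative, not the paper's benchmark.
from transformers import pipeline

# Any BERT-style masked LM can be probed this way without fine-tuning.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Cloze-style facts: the model must recover the masked entity purely
# from knowledge acquired during pretraining (zero-shot).
prompts = [
    "The capital of France is [MASK].",
    "Barack Obama was born in [MASK].",
]

for prompt in prompts:
    predictions = fill_mask(prompt, top_k=3)
    completions = [(p["token_str"], round(p["score"], 3)) for p in predictions]
    print(prompt, "->", completions)
```

Ranking the gold entity among the model's top predictions over a large set of such cloze statements gives a simple measure of how much factual knowledge the pretrained model already encodes.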

Wenhan Xiong, Jingfei Du, William Yang Wang, Veselin Stoyanov • 2019

Related benchmarks

Task | Dataset | Result | Rank
Fine-Grained Entity Typing | FIGER (test) | Macro F1 81.99 | 22
Open-domain Question Answering | SearchQA | EM 61.7 | 13
Open-domain Question Answering | Quasar-T | EM 45.8 | 12
