
Language Models are Open Knowledge Graphs

About

This paper shows how to construct knowledge graphs (KGs) from pre-trained language models (e.g., BERT, GPT-2/3) without human supervision. Popular KGs (e.g., Wikidata, NELL) are built in a supervised or semi-supervised manner, requiring humans to create the knowledge. Recent deep language models instead acquire knowledge automatically from large-scale corpora via pre-training. This stored knowledge has enabled language models to improve downstream NLP tasks, e.g., answering questions and writing code and articles. In this paper, we propose an unsupervised method to cast the knowledge contained within language models into KGs. We show that KGs can be constructed with a single forward pass of the pre-trained language models (without fine-tuning) over the corpora. We demonstrate the quality of the constructed KGs by comparing them to two human-created KGs (Wikidata, TAC KBP). Our KGs also provide open factual knowledge that is not present in existing KGs. Our code and KGs will be made publicly available.
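The paper's full extraction pipeline is not reproduced here, but the core idea the abstract describes, reading candidate facts out of a frozen language model's attention in a single forward pass, can be illustrated with a small sketch. The snippet below is an assumption-laden simplification, not the authors' released code: it uses spaCy noun chunks as head/tail candidates, assumes `bert-base-cased` and the `en_core_web_sm` spaCy model, and scores a single in-between token as the relation rather than running the paper's search over attention matrices.

```python
# Minimal sketch: propose (head, relation, tail) candidates from one forward
# pass of a frozen BERT, scoring in-between tokens by the attention they receive.
# Illustrative only; not the paper's exact algorithm.
import torch
import spacy
from transformers import AutoTokenizer, AutoModel

nlp = spacy.load("en_core_web_sm")                       # assumed spaCy model
tok = AutoTokenizer.from_pretrained("bert-base-cased")   # assumed LM choice
model = AutoModel.from_pretrained("bert-base-cased", output_attentions=True)
model.eval()

def candidate_triples(sentence, top_k=1):
    doc = nlp(sentence)
    chunks = list(doc.noun_chunks)                       # head/tail candidates
    enc = tok(sentence, return_tensors="pt", return_offsets_mapping=True)
    offsets = enc.pop("offset_mapping")[0].tolist()
    with torch.no_grad():
        out = model(**enc)                               # single forward pass
    # Average the last layer's attention over heads: (seq_len, seq_len)
    attn = out.attentions[-1][0].mean(dim=0)

    def token_ids(span):
        # word-piece indices whose character offsets fall inside the spaCy span
        return [i for i, (s, e) in enumerate(offsets)
                if s < e and s >= span.start_char and e <= span.end_char]

    triples = []
    for head in chunks:
        for tail in chunks:
            if head.start >= tail.start:
                continue
            h_ids, t_ids = token_ids(head), token_ids(tail)
            between = [i for i, (s, e) in enumerate(offsets)
                       if s < e and s >= head.end_char and e <= tail.start_char]
            if not (h_ids and t_ids and between):
                continue
            # Score each in-between token by attention paid to it from head/tail
            scores = [(attn[h_ids, i].mean() + attn[t_ids, i].mean()).item()
                      for i in between]
            best = max(range(len(scores)), key=scores.__getitem__)
            s_char, e_char = offsets[between[best]]
            triples.append((head.text, sentence[s_char:e_char], tail.text,
                            scores[best]))
    return sorted(triples, key=lambda t: -t[-1])[:top_k]

print(candidate_triples("Dylan is a songwriter and Nobel laureate."))
```

In the paper, extraction runs over full corpora and the candidate facts are further filtered and mapped onto a KG; the sketch above only demonstrates the single-forward-pass, no-fine-tuning aspect emphasized in the abstract.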

Chenguang Wang, Xiao Liu, Dawn Song • 2020

Related benchmarks

Task                         | Dataset  | Metric | Result | Rank
Open Information Extraction  | WEB      | F1     | 54.3   | 18
Open Information Extraction  | NYT      | F1     | 32.9   | 18
Open Information Extraction  | OIE 2016 | F1     | 36.6   | 18
Open Information Extraction  | PENN     | F1     | 33     | 18
