
Neural Compositional Rule Learning for Knowledge Graph Reasoning

About

Learning logical rules is critical to improving reasoning in KGs, because rules provide logical, interpretable explanations for predictions and generalize to other tasks, domains, and data. While recent methods have been proposed to learn logical rules, most are either limited by their computational complexity, and so cannot handle the large search space of large-scale KGs, or generalize poorly to data outside the training set. In this paper, we propose NCRL, an end-to-end neural model for learning compositional logical rules. NCRL detects the best compositional structure of a rule body and breaks it into small compositions in order to infer the rule head. By recurrently merging compositions in the rule body with a recurrent attention unit, NCRL finally predicts a single rule head. Experimental results show that NCRL learns high-quality, generalizable rules. Specifically, NCRL is scalable, efficient, and yields state-of-the-art results for knowledge graph completion on large-scale KGs. Moreover, we test NCRL for systematic generalization by training on small observed graphs and evaluating on larger unseen ones.
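The abstract describes NCRL's core loop: score adjacent relation pairs in the rule body, merge the best pair into a single composed embedding, and repeat until one embedding remains, which is then matched against the relation vocabulary to predict the rule head. A minimal NumPy sketch of that loop is below; the embedding size, merge function, and dot-product pair scoring are illustrative stand-ins, not the paper's actual architecture or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 5 relations with 8-dim embeddings (sizes are
# illustrative, not the paper's).
NUM_RELATIONS, DIM = 5, 8
relation_emb = rng.normal(size=(NUM_RELATIONS, DIM))


def merge(a, b, W):
    """Compose two adjacent relation embeddings into one embedding
    (a simple stand-in for NCRL's recurrent attention unit)."""
    return np.tanh(W @ np.concatenate([a, b]))


def predict_head(body, W):
    """Recurrently merge the highest-scoring adjacent pair in the rule
    body until one embedding remains, then match it against the relation
    table to predict the rule head index."""
    seq = [relation_emb[r] for r in body]
    while len(seq) > 1:
        # Score each adjacent pair; dot product is a simple proxy for
        # the paper's learned attention scores.
        scores = np.array([seq[i] @ seq[i + 1] for i in range(len(seq) - 1)])
        i = int(scores.argmax())
        seq[i:i + 2] = [merge(seq[i], seq[i + 1], W)]  # replace pair by its composition
    head_scores = relation_emb @ seq[0]
    return int(head_scores.argmax())


W = rng.normal(size=(DIM, 2 * DIM))  # untrained merge weights, for illustration only
predicted_head = predict_head([0, 2, 3], W)  # rule body: r0 ∧ r2 ∧ r3
```

In the paper this merge operator and the pair-selection attention are trained end-to-end so that the final embedding lands near the correct head relation; the sketch only shows the control flow of the recurrent reduction.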

Kewei Cheng, Nesreen K. Ahmed, Yizhou Sun • 2023

Related benchmarks

Task                        Dataset          Metric   Result   Rank
Knowledge Graph Reasoning   WN18RR           MRR      0.67     19
Knowledge Graph Reasoning   Kinship (test)   MRR      0.64     19
Knowledge Graph Reasoning   FB15k-237        MRR      0.30     19
Knowledge Graph Reasoning   UMLS (test)      MRR      0.78     17
Knowledge Graph Reasoning   Family (test)    MRR      0.91     16
Knowledge Graph Reasoning   YAGO3-10         MRR      0.38     14
