
Advancing Abductive Reasoning in Knowledge Graphs through Complex Logical Hypothesis Generation

About

Abductive reasoning is the process of making educated guesses to provide explanations for observations. Although many applications require knowledge to support explanations, the use of abductive reasoning in conjunction with structured knowledge, such as a knowledge graph (KG), remains largely unexplored. To fill this gap, this paper introduces the task of complex logical hypothesis generation as an initial step toward abductive logical reasoning with KGs. In this task, we aim to generate a complex logical hypothesis that explains a given set of observations. We find that a supervised-trained generative model can generate logical hypotheses that are structurally close to the reference hypothesis. However, when generalizing to unseen observations, this training objective does not guarantee better hypothesis generation. To address this, we introduce the Reinforcement Learning from Knowledge Graph (RLF-KG) method, which minimizes the difference between the observations and the conclusions drawn from the generated hypotheses according to the KG. Experiments show that, with RLF-KG's assistance, the generated hypotheses provide better explanations and achieve state-of-the-art results on three widely used KGs.
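The abstract's key quantity is the agreement between the observed entities and the entities entailed by a generated hypothesis on the KG, which the benchmarks below report as Jaccard similarity. A minimal sketch of that reward signal, assuming both sides are represented as plain entity sets (the function name and example entities are illustrative, not the paper's API):

```python
def jaccard_similarity(observations: set, conclusions: set) -> float:
    """Jaccard similarity |A ∩ B| / |A ∪ B| between the observed entity set
    and the conclusion set entailed by a hypothesis on the KG."""
    if not observations and not conclusions:
        return 1.0  # two empty sets are trivially identical
    return len(observations & conclusions) / len(observations | conclusions)

# Illustrative example: the hypothesis recovers 2 of 3 observations
# and over-generates 1 extra entity.
obs = {"e1", "e2", "e3"}
concl = {"e1", "e2", "e4"}
reward = jaccard_similarity(obs, concl)  # 2 / 4 = 0.5
```

A higher value means the hypothesis's conclusions cover the observations without over-generating, which is the quantity RLF-KG's training objective pushes up.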

Jiaxin Bai, Yicheng Wang, Tianshi Zheng, Yue Guo, Xin Liu, Yangqiu Song • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Abductive Reasoning | FB15k-237 (test) | 1p | 85.5 | 4 |
| Abductive Reasoning | WN18RR (test) | 1p Score | 0.85 | 4 |
| Abductive Reasoning | DBpedia50 (test) | 1p Success | 0.842 | 4 |
| Abductive Knowledge Graph Reasoning | FB15k-237 (test) | Jaccard Similarity | 0.666 | 2 |
| Abductive Knowledge Graph Reasoning | DBpedia50 (test) | Jaccard Similarity | 0.731 | 2 |
| Knowledge Graph Reasoning | WN18RR (test) | -- | -- | 2 |
