
GammaE: Gamma Embeddings for Logical Queries on Knowledge Graphs

About

Embedding knowledge graphs (KGs) for multi-hop logical reasoning is a challenging problem because many KGs have massive, complicated structures. Recently, many promising works have projected entities and queries into a geometric space to find answers efficiently. However, modeling the negation and union operators remains difficult. The negation operator has no strict boundaries, which produces overlapping embeddings and leads to ambiguous answers. A further limitation is that the union operator is not closed, which prevents models from handling chains of union operators. To address these problems, we propose a novel probabilistic embedding model, Gamma Embeddings (GammaE), for encoding entities and queries to answer different types of FOL queries on KGs. We exploit the linear property and strong boundary support of the Gamma distribution to capture more features of entities and queries, which dramatically reduces model uncertainty. Furthermore, GammaE implements the Gamma mixture method to design a closed union operator. The performance of GammaE is validated on three large logical-query datasets. Experimental results show that GammaE significantly outperforms state-of-the-art models on public benchmarks.
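The closure argument for the Gamma-mixture union can be illustrated with a small sketch. This is not the paper's exact formulation; the parameter values and weights below are hypothetical, and a query is reduced to a single one-dimensional Gamma distribution for clarity. The point is that a weighted mixture of Gamma densities is itself a valid density in the mixture family, so repeated unions stay closed.

```python
import numpy as np
from scipy.stats import gamma
from scipy.integrate import quad

def gamma_pdf(x, alpha, beta):
    # Gamma density with shape alpha and rate beta
    # (SciPy uses a scale parameter, so scale = 1/beta).
    return gamma.pdf(x, a=alpha, scale=1.0 / beta)

def union_pdf(x, params, weights):
    # Union of queries modeled as a Gamma mixture:
    # p(x) = sum_i w_i * Gamma(x; alpha_i, beta_i), with sum_i w_i = 1.
    return sum(w * gamma_pdf(x, a, b) for (a, b), w in zip(params, weights))

# Hypothetical (alpha, beta) parameters for two query embeddings.
params = [(2.0, 1.0), (5.0, 2.0)]
weights = [0.5, 0.5]

# The mixture is still a proper probability density: it integrates to 1,
# so a further union with a third Gamma query remains in the same family.
total, _ = quad(lambda x: union_pdf(x, params, weights), 0.0, np.inf)
print(round(total, 6))
```

Because the mixture never leaves the family of Gamma mixtures, a chain of union operators can be evaluated without approximating the result by a single distribution, which is the closure property the abstract refers to.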

Dong Yang, Peijun Qing, Yang Li, Haonan Lu, Xiaodong Lin • 2022

Related benchmarks

Task                      | Dataset   | Result                       | Rank
Logical Query Answering   | FB15k-237 | MRR (2-inverse path): 0.335  | 29
Knowledge Graph Reasoning | FB15k-237 | --                           | 19
Knowledge Graph Reasoning | FB15k     | 1P Score: 76.5               | 12
Knowledge Graph Reasoning | NELL      | 1P Score: 55.1               | 10
