
Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders

About

Representation learning for knowledge graphs (KGs) has focused on the problem of answering simple link prediction queries. In this work we address the more ambitious challenge of predicting the answers of conjunctive queries with multiple missing entities. We propose Bi-Directional Query Embedding (BIQE), a method that embeds conjunctive queries with models based on bidirectional attention mechanisms. In contrast to prior work, bidirectional self-attention can capture interactions among all the elements of a query graph. We introduce a new dataset for predicting the answers of conjunctive queries and conduct experiments that show BIQE significantly outperforming state-of-the-art baselines.

Bhushan Kotnis, Carolin Lawrence, Mathias Niepert • 2020
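The abstract's key architectural point is that bidirectional self-attention applies no causal mask, so every element of a linearized query graph can attend to every other element. The sketch below illustrates this with a toy single-head attention layer in numpy; the query linearization, dimensions, and weight matrices are illustrative placeholders, not BIQE's actual model.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidirectional_self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention with NO causal mask.

    X: (seq_len, d) embeddings of the linearized query graph elements.
    Returns the attended representations and the attention matrix.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    A = softmax(scores, axis=-1)  # dense: every position attends to all
    return A @ V, A

rng = np.random.default_rng(0)
d = 8
# Toy linearized conjunctive query, e.g. [e1, r1, ?x, r2, e2] -> 5 positions.
X = rng.normal(size=(5, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, A = bidirectional_self_attention(X, Wq, Wk, Wv)
# Every entry of A is strictly positive and each row sums to 1, so each
# query element receives a contribution from every other element --
# unlike a causally masked (left-to-right) encoder.
```

A causal encoder would zero out the upper triangle of `A`, restricting each element to interactions with earlier positions only; the dense attention matrix here is what lets interactions flow among all parts of the query graph.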

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Logical reasoning | NELL995 transductive (test) | Avg Hits@3 | 30.9 | 5 |
| Logical reasoning | FB15K-237 transductive (test) | Avg Hits@3 | 26.2 | 5 |
| Logical reasoning | FB15k transductive (test) | Avg Hits@3 | 48.2 | 5 |
| Inductive logical reasoning | NELL V3 (test) | Avg Success Rate | 0.089 | 4 |
| Inductive logical reasoning | FB15k-237 V2 (test) | Avg Score | 15.8 | 4 |
| Complex logical reasoning | NELL995 Inductive cross-KG (test) | Avg Hits@10 | 0.042 | 2 |
| Complex logical reasoning | FB15k Inductive cross-KG (test) | Avg Hits@10 | 8.2 | 2 |
