Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders
About
Representation learning for knowledge graphs (KGs) has focused on the problem of answering simple link prediction queries. In this work, we address the more ambitious challenge of predicting the answers to conjunctive queries with multiple missing entities. We propose Bi-Directional Query Embedding (BIQE), a method that embeds conjunctive queries with models based on bidirectional attention mechanisms. Contrary to prior work, bidirectional self-attention can capture interactions among all the elements of a query graph. We introduce a new dataset for predicting the answers of conjunctive queries and conduct experiments that show BIQE significantly outperforming state-of-the-art baselines.
Bhushan Kotnis, Carolin Lawrence, Mathias Niepert• 2020
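The core idea from the abstract — linearizing a conjunctive query graph into a token sequence and letting bidirectional self-attention capture interactions among all of its elements — can be illustrated with a minimal numpy sketch. The query, its linearization, and all dimensions below are illustrative assumptions, not BIQE's actual vocabulary, tokenization, or architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy conjunctive query with one target entity, e.g.
#   ?t : bornIn(Alice, ?c) AND locatedIn(?c, ?t)
# linearized into a token sequence (hypothetical scheme):
tokens = ["Alice", "bornIn", "?c", "?c", "locatedIn", "?t"]

d = 16                                   # embedding dimension (illustrative)
emb = {tok: rng.normal(size=d) for tok in set(tokens)}
X = np.stack([emb[t] for t in tokens])   # (seq_len, d)

# One bidirectional self-attention head: no causal mask, so every
# element of the query graph can attend to every other element.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv
scores = Q @ K.T / np.sqrt(d)            # dense (seq_len, seq_len) interactions
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)
out = attn @ V                           # contextualized token representations
```

Because the attention matrix is dense rather than causally masked, the representation of the target placeholder `?t` is conditioned on every entity, relation, and variable in the query — the property the abstract contrasts with prior sequential or tree-structured query encoders.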
Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Logical reasoning | NELL995 transductive (test) | Avg Hits@3 | 30.9 | 5 |
| Logical reasoning | FB15K-237 transductive (test) | Avg Hits@3 | 26.2 | 5 |
| Logical reasoning | FB15k transductive (test) | Avg Hits@3 | 48.2 | 5 |
| Inductive logical reasoning | NELL V3 (test) | Avg Success Rate | 0.089 | 4 |
| Inductive logical reasoning | FB15k-237 V2 (test) | Avg Score | 15.8 | 4 |
| Complex logical reasoning | NELL995 Inductive cross-KG (test) | Avg Hits@10 | 0.042 | 2 |
| Complex logical reasoning | FB15k Inductive cross-KG (test) | Avg Hits@10 | 8.2 | 2 |