
Case-based Reasoning for Natural Language Queries over Knowledge Bases

About

It is often challenging to solve a complex problem from scratch, but much easier if we can access other similar problems together with their solutions, a paradigm known as case-based reasoning (CBR). We propose a neuro-symbolic CBR approach (CBR-KBQA) for question answering over large knowledge bases. CBR-KBQA consists of a nonparametric memory that stores cases (questions and logical forms) and a parametric model that generates a logical form for a new question by retrieving cases relevant to it. On several KBQA datasets that contain complex questions, CBR-KBQA achieves competitive performance. For example, on the ComplexWebQuestions dataset, CBR-KBQA outperforms the previous state of the art by 11% in accuracy. Furthermore, we show that CBR-KBQA can use new cases without any further training: by incorporating a few human-labeled examples in the case memory, CBR-KBQA successfully generates logical forms containing unseen KB entities as well as relations.
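The retrieval step described above can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: the bag-of-words `embed` function stands in for a trained dense question encoder, and the case memory, questions, and logical forms are invented examples. It shows the key property that new cases can be added to the nonparametric memory at any time, with no retraining.

```python
from dataclasses import dataclass
from math import sqrt

@dataclass
class Case:
    """A stored case: a question paired with its logical form."""
    question: str
    logical_form: str

def embed(text: str) -> dict:
    # Toy bag-of-words vector; a hypothetical stand-in for a learned encoder.
    vec: dict = {}
    for tok in text.lower().split():
        vec[tok] = vec.get(tok, 0) + 1
    return vec

def cosine(a: dict, b: dict) -> float:
    dot = sum(v * b.get(t, 0) for t, v in a.items())
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(memory: list, query: str, k: int = 2) -> list:
    # Rank stored cases by similarity to the new question; return the top k.
    q = embed(query)
    return sorted(memory, key=lambda c: cosine(q, embed(c.question)),
                  reverse=True)[:k]

# Hypothetical case memory; adding a Case here requires no retraining.
memory = [
    Case("who directed Titanic", "(JOIN film.director Titanic)"),
    Case("who wrote Hamlet", "(JOIN book.author Hamlet)"),
    Case("where was Obama born", "(JOIN people.place_of_birth Obama)"),
]

cases = retrieve(memory, "who directed Avatar", k=1)
# The retrieved question/logical-form pairs would then be concatenated with
# the new question and passed to a parametric seq2seq generator (not shown).
```

In the full system the generator conditions on the retrieved cases, which is what lets it emit KB relations it never saw during training.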

Rajarshi Das, Manzil Zaheer, Dung Thai, Ameya Godbole, Ethan Perez, Jay-Yoon Lee, Lizhen Tan, Lazaros Polymenakos, Andrew McCallum • 2021

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Knowledge Base Question Answering | WEBQSP (test) | -- | 143 |
| Knowledge Graph Question Answering | CWQ | Hit@1: 67.14 | 105 |
| Multi-hop Knowledge Graph Question Answering | WebQSP | Hits@1: 85.7 | 50 |
| Multi-hop Knowledge Graph Question Answering | CWQ | Hits@1: 70.4 | 46 |
| Knowledge Base Question Answering | WebQSP Freebase (test) | F1 Score: 72.8 | 46 |
| Knowledge Base Question Answering | CWQ (test) | F1 Score: 70 | 42 |
| Knowledge Base Question Answering | WebQSP | Accuracy: 69.9 | 23 |
| Multi-hop Knowledge Graph Question Answering | GrailQA | Hits@1: 75.4 | 21 |
| Knowledge Base Question Answering | CWQ Freebase (test) | Hits@1: 70.4 | 19 |
| Open-domain Question Answering | WebQuestions | Hits@1: 56.3 | 19 |

Showing 10 of 18 rows
