
GETT-QA: Graph Embedding based T2T Transformer for Knowledge Graph Question Answering

About

In this work, we present an end-to-end Knowledge Graph Question Answering (KGQA) system named GETT-QA. GETT-QA uses T5, a popular text-to-text pre-trained language model. The model takes a question in natural language as input and produces a simpler form of the intended SPARQL query. In the simpler form, the model does not directly produce entity and relation IDs. Instead, it produces corresponding entity and relation labels. The labels are grounded to KG entity and relation IDs in a subsequent step. To further improve the results, we instruct the model to produce a truncated version of the KG embedding for each entity. The truncated KG embedding enables a finer search for disambiguation purposes. We find that T5 is able to learn the truncated KG embeddings without any change of loss function, improving KGQA performance. As a result, we report strong results for LC-QuAD 2.0 and SimpleQuestions-Wikidata datasets on end-to-end KGQA over Wikidata.

Debayan Banerjee, Pranav Ajit Nair, Ricardo Usbeck, Chris Biemann • 2023
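The abstract describes a two-stage pipeline: T5 first generates a simplified SPARQL skeleton that contains entity and relation labels plus a truncated KG embedding per entity, and a subsequent step grounds those labels to Wikidata IDs, using the truncated embedding to rerank candidates. The sketch below illustrates that flow with Hugging Face transformers; the checkpoint name, the skeleton output format, and the grounding helper are assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch of the two-stage pipeline described in the abstract.
# The checkpoint, skeleton format, and ground_label() are assumptions,
# not the paper's released code.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-base")  # fine-tuned GETT-QA weights in practice
model = T5ForConditionalGeneration.from_pretrained("t5-base")

def generate_skeleton(question: str) -> str:
    """Step 1: T5 maps the question to a simplified SPARQL query containing
    entity/relation labels and a truncated KG embedding per entity."""
    inputs = tokenizer(question, return_tensors="pt")
    output_ids = model.generate(**inputs, max_length=128)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

def ground_label(label: str, truncated_embedding: list[float]) -> str:
    """Step 2 (stub): map an entity or relation label to a Wikidata ID.
    Per the abstract, candidates retrieved by label search would be reranked
    by comparing their truncated KG embeddings to the generated one."""
    raise NotImplementedError  # e.g. label-index lookup + embedding distance

skeleton = generate_skeleton("Who is the director of Inception?")
print(skeleton)  # hypothetical output: select ?o where { [Inception | 0.12 -0.43 ...] [director] ?o }
```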

Related benchmarks

Task | Dataset | Result | Rank
Knowledge Base Question Answering | SimpleQuestions Wiki (test) | F1 76.1 | 7
Knowledge Graph Question Answering | LC-QuAD 2.0 (Wikidata dump 20 April 2021, NLIWOD) (test) | -- | 5
Knowledge Graph Question Answering | SimpleQuestions-Wikidata (Wiki4M) | -- | 2

Other info

Code
