
Content Enhanced BERT-based Text-to-SQL Generation

About

We present a simple method that leverages table content in a BERT-based model to solve the text-to-SQL problem. Based on the observation that some table cell values and some table headers match words in the question string, we encode two additional feature vectors for the deep model. Our method also benefits model inference at test time, since the tables are almost the same at training and test time. We evaluate our model on the WikiSQL dataset, outperform the BERT-based baseline by 3.7% in logical-form accuracy and 3.7% in execution accuracy, and achieve state-of-the-art results.
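The two feature vectors can be understood as per-token binary flags over the question: one marking tokens that appear in some table cell value, the other marking tokens that appear in some column header. The sketch below is a hypothetical illustration of that matching step, assuming simple whitespace tokenization and case-insensitive exact word match; the function and variable names are not from the paper.

```python
# Hypothetical sketch of the content-match and header-match features.
# For each question token we emit two binary flags:
#   - content flag: 1 if the token appears in any table cell value
#   - header flag:  1 if the token appears in any column header
# Tokenization and matching here are simplified for illustration.

def match_features(question_tokens, headers, cells):
    header_words = {w.lower() for h in headers for w in h.split()}
    cell_words = {w.lower() for c in cells for w in str(c).split()}
    content_flags = [1 if t.lower() in cell_words else 0 for t in question_tokens]
    header_flags = [1 if t.lower() in header_words else 0 for t in question_tokens]
    return content_flags, header_flags

question = "what is the population of new york".split()
headers = ["City", "Population"]
cells = ["new york", "los angeles", "8336817"]
content_flags, header_flags = match_features(question, headers, cells)
# "new" and "york" match a cell value; "population" matches a header
```

In the model, these flags would be embedded and added to the BERT input representation alongside the token embeddings; because the tables seen at test time largely overlap with those seen in training, the same matching signal remains available at inference.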

Tong Guo, Huilin Gao • 2019

Related benchmarks

Task         Dataset                          Metric               Result  Rank
Text-to-SQL  WikiSQL Fully-supervised (dev)   Execution Accuracy   91.2    12
Text-to-SQL  WikiSQL Fully-supervised (test)  Execution Accuracy   90.6    12
Text-to-SQL  WikiSQL original (dev)           Exact Match (EM)     84.3    9
Text-to-SQL  WikiSQL ADVETA-RPL               Exact Match (EM)     52.2    9
Text-to-SQL  WikiSQL ADVETA-ADD               Exact Match (EM)     71.2    9
Text-to-SQL  WikiSQL (dev)                    Logic Form Accuracy  85.4    8
Text-to-SQL  WikiSQL (test)                   Logic Form Accuracy  84.5    8

Other info

Code
