
Grammar-based Neural Text-to-SQL Generation

About

The sequence-to-sequence paradigm employed by neural text-to-SQL models typically performs token-level decoding and does not consider generating SQL hierarchically from a grammar. Grammar-based decoding has shown significant improvements for other semantic parsing tasks, but SQL and other general programming languages have complexities not present in logical formalisms that make writing hierarchical grammars difficult. We introduce techniques to handle these complexities, showing how to construct a schema-dependent grammar with minimal over-generation. We analyze these techniques on ATIS and Spider, two challenging text-to-SQL datasets, demonstrating that they yield 14–18% relative reductions in error.
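The schema-dependent grammar idea from the abstract can be illustrated with a toy sketch: instead of emitting SQL token by token, the decoder expands grammar productions, and table/column nonterminals are instantiated from the database schema so only identifiers that actually exist can be generated. Everything below (the `SCHEMA`, the production rules, the `expand` helper) is an illustrative assumption, not the paper's actual grammar or implementation.

```python
# Toy sketch of schema-dependent, grammar-based SQL generation.
# The schema, productions, and function names are assumptions for
# illustration; the paper's real grammar is far richer.

SCHEMA = {
    "flight": ["flight_id", "departure_time", "arrival_time"],
    "airport": ["airport_code", "city"],
}

def build_grammar(schema):
    """Build production rules where the 'table' and 'column'
    nonterminals are instantiated from the schema, so the decoder
    cannot over-generate identifiers absent from the database."""
    return {
        "query": [["SELECT", "column", "FROM", "table"]],
        "table": [[t] for t in schema],
        "column": [[c] for cols in schema.values() for c in cols],
    }

def expand(symbol, grammar, choose):
    """Recursively expand a nonterminal by choosing one production
    at each step (a real model would score the choices with a
    neural network instead of a fixed policy)."""
    if symbol not in grammar:          # terminal token
        return [symbol]
    production = choose(symbol, grammar[symbol])
    tokens = []
    for sym in production:
        tokens.extend(expand(sym, grammar, choose))
    return tokens

# Stand-in "model": always pick the first production.
grammar = build_grammar(SCHEMA)
sql = " ".join(expand("query", grammar, lambda s, prods: prods[0]))
print(sql)  # SELECT flight_id FROM flight
```

Because every derivation starts from `query` and bottoms out in schema-derived terminals, each decoded sequence is a well-formed query over the given schema by construction, which is the contrast with unconstrained token-level decoding.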

Kevin Lin, Ben Bogin, Mark Neumann, Jonathan Berant, Matt Gardner • 2019

Related benchmarks

Task              Dataset         Result                      Rank
Text-to-SQL       Spider (test)   Execution Accuracy: 33.8    140
Text-to-SQL       Spider (dev)    --                          100
Semantic Parsing  ATIS (dev)      Query Accuracy: 39.1        10
Text-to-SQL       ATIS (test)     Query Accuracy: 44.1        7
