
Language to Logical Form with Neural Attention

About

Semantic parsing aims at mapping natural language to machine interpretable meaning representations. Traditional approaches rely on high-quality lexicons, manually-built templates, and linguistic features which are either domain- or representation-specific. In this paper we present a general method based on an attention-enhanced encoder-decoder model. We encode input utterances into vector representations, and generate their logical forms by conditioning the output sequences or trees on the encoding vectors. Experimental results on four datasets show that our approach performs competitively without using hand-engineered features and is easy to adapt across domains and meaning representations.
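The core mechanism the abstract describes — conditioning each output step on a learned weighting over the encoded input utterance — can be sketched in a few lines. The snippet below is a minimal, illustrative attention step using dot-product scoring (one common scoring choice; the paper's model learns its own scoring parameters), with all array sizes and names chosen purely for illustration.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_step(encoder_states, decoder_state):
    """One attention step: score each encoder state against the current
    decoder state (dot-product scoring, a common choice), then return
    the attention-weighted context vector used to predict the next
    logical-form token."""
    scores = encoder_states @ decoder_state   # (T,) one score per input token
    weights = softmax(scores)                 # attention distribution over tokens
    context = weights @ encoder_states        # (d,) weighted sum of encoder states
    return context, weights

# Toy example: utterance of 5 tokens, hidden size 4 (illustrative only).
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 4))   # encoder vector representations of the utterance
s = rng.normal(size=4)        # current decoder state while emitting the logical form
context, weights = attention_step(H, s)
```

At each decoding step the context vector is combined with the decoder state to score output tokens; in the paper's tree-decoding variant the same mechanism conditions subtree generation as well.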

Li Dong, Mirella Lapata • 2016

Related benchmarks

Task              Dataset                     Metric                 Result   Rank
Text-to-SQL       Spider 1.0 (dev)            Exact Match Accuracy   1.9      92
Text-to-SQL       Spider 1.0 (test)           EM Acc (Overall)       4.8      91
Semantic Parsing  CFQ (MCD1)                  Accuracy               24.3     33
Semantic Parsing  CFQ (MCD2)                  Accuracy               4.1      33
Semantic Parsing  CFQ (MCD3)                  Accuracy               6.3      33
Code Generation   Django (test)               Accuracy               39.4     28
Semantic Parsing  WikiSQL (test)              Execution Accuracy     35.9     27
Semantic Parsing  GEO                         Accuracy               0.871    26
Semantic Parsing  ATIS                        Accuracy               84.8     19
Semantic Parsing  WikiTableQuestions (test)   --                     --       17

(Showing 10 of 19 rows)
