
Improving Grammar-based Sequence-to-Sequence Modeling with Decomposition and Constraints

About

Neural QCFG is a grammar-based sequence-to-sequence (seq2seq) model with strong inductive biases on hierarchical structures. It excels in interpretability and generalization but suffers from expensive inference. In this paper, we study two low-rank variants of Neural QCFG for faster inference with different trade-offs between efficiency and expressiveness. Furthermore, utilizing the symbolic interface provided by the grammar, we introduce two soft constraints over tree hierarchy and source coverage. We experiment with various datasets and find that our models outperform vanilla Neural QCFG in most settings.
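The speedup from the low-rank variants comes from factorizing large rule-score tensors so inference never materializes them. A toy sketch of the idea (this is an illustration, not the paper's actual parameterization; all names and sizes here are hypothetical):

```python
import numpy as np

# Hypothetical sizes: n nonterminals, rank r << n.
n, r = 64, 8
rng = np.random.default_rng(0)

# Low-rank factors standing in for a binary-rule score tensor
# T[a, b, c] ~ sum_k U[a, k] * V[b, k] * W[c, k].
U, V, W = (rng.random((n, r)) for _ in range(3))

# Toy inside scores for the two children spans.
beta_B = rng.random(n)
beta_C = rng.random(n)

# Naive contraction: materialize T, costing O(n^3) time and memory.
T = np.einsum('ak,bk,ck->abc', U, V, W)
naive = np.einsum('abc,b,c->a', T, beta_B, beta_C)

# Low-rank contraction: never build T; each step is O(n * r).
fast = U @ ((V.T @ beta_B) * (W.T @ beta_C))

assert np.allclose(naive, fast)
```

Both contractions give the same parent scores, but the factored form avoids the cubic blow-up in the number of nonterminals, which is the efficiency/expressiveness trade-off the paper explores.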

Chao Lou, Kewei Tu • 2023

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Instruction Following | SCAN jump | Accuracy 97.08 | 18 |
| Style Transfer | StylePTB ATP (Active to passive) | BLEU-4 75.44 | 11 |
| Machine Translation | En-Fr Machine Translation (small-scale) | BLEU-4 30.51 | 11 |
| Command-to-action mapping | SCAN (length) | Accuracy 91.72 | 11 |
| Sequence Transduction | SCAN Simple | Accuracy 95.27 | 3 |
| Sequence Transduction | SCAN A. Right | Accuracy 97.63 | 3 |

Other info

Code
