Structural Embedding of Syntactic Trees for Machine Comprehension
About
Deep neural networks for machine comprehension typically utilize only word or character embeddings without explicitly taking advantage of structured linguistic information such as constituency trees and dependency trees. In this paper, we propose structural embedding of syntactic trees (SEST), an algorithmic framework that utilizes structured information and encodes it into vector representations to boost the performance of machine comprehension algorithms. We evaluate our approach using a state-of-the-art neural attention model on the SQuAD dataset. Experimental results demonstrate that our model can accurately identify the syntactic boundaries of sentences and extract answers that are more syntactically coherent than those of the baseline methods.
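To make the core idea concrete, here is a minimal sketch of how a syntactic tree can be encoded into per-word vectors: each word is represented by the sequence of labels on its root-to-leaf path in a constituency tree, and that label sequence is mapped to a vector. All names here (`leaf_paths`, `embed_path`, the toy tree, the random label embeddings, and averaging as the aggregation) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of structural embedding of a syntactic tree (SEST idea):
# represent each word by the label sequence on its root-to-leaf path, then
# map the sequence to a vector (here: averaged random label embeddings).

import random

random.seed(0)

# Toy constituency tree for "the cat sat":
# (S (NP (DT the) (NN cat)) (VP (VBD sat)))
tree = ("S", [("NP", [("DT", "the"), ("NN", "cat")]),
              ("VP", [("VBD", "sat")])])

def leaf_paths(node, prefix=()):
    """Yield (word, root-to-leaf label path) pairs for every leaf."""
    label, children = node
    path = prefix + (label,)
    if isinstance(children, str):       # pre-terminal: children is the word
        yield children, path
    else:
        for child in children:
            yield from leaf_paths(child, path)

# Assumed label vocabulary with random 4-dimensional embeddings.
labels = {"S", "NP", "VP", "DT", "NN", "VBD"}
emb = {l: [random.uniform(-1, 1) for _ in range(4)] for l in labels}

def embed_path(path):
    """Average the embeddings of the labels along the path."""
    vecs = [emb[l] for l in path]
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(4)]

for word, path in leaf_paths(tree):
    print(word, path, [round(x, 2) for x in embed_path(path)])
```

In the paper's setting, such structural vectors would be concatenated with word or character embeddings before being fed to the neural attention model; averaging is used here only to keep the sketch self-contained.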
Related benchmarks
| Task | Dataset | Metric | Score | Rank |
|---|---|---|---|---|
| Question Answering | SQuAD v1.1 (dev) | F1 | 84.1 | 375 |
| Question Answering | SQuAD v1.1 (test) | F1 | 80.8 | 260 |
| Question Answering | SQuAD (test) | F1 | 80.84 | 111 |
| Question Answering | SQuAD hidden 1.1 (test) | EM | 68.2 | 18 |
| Question Answering | AddOneSent (test) | EM | 40 | 15 |
| Question Answering | adversarial SQuAD (test) | AddSent Score | 33.9 | 12 |
| Reading Comprehension | Adversarial SQuAD AddSent v1.1 (test) | F1 | 33.9 | 10 |
| Reading Comprehension | Adversarial SQuAD AddOneSent v1.1 (test) | F1 | 44.8 | 10 |
| Question Answering | AddSent | EM | 30 | 8 |
| Machine Reading Comprehension | SQuAD AddSent | EM | 30 | 7 |