
Structural Embedding of Syntactic Trees for Machine Comprehension

About

Deep neural networks for machine comprehension typically utilize only word or character embeddings, without explicitly taking advantage of structured linguistic information such as constituency trees and dependency trees. In this paper, we propose structural embedding of syntactic trees (SEST), an algorithmic framework that encodes structured information into vector representations to boost the performance of machine comprehension algorithms. We evaluate our approach using a state-of-the-art neural attention model on the SQuAD dataset. Experimental results demonstrate that our model can accurately identify the syntactic boundaries of sentences and extract answers that are more syntactically coherent than those of the baseline methods.
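The core idea can be sketched as follows: each token receives an additional vector derived from its position in the syntactic tree, which is combined with the usual word embedding before encoding. The sketch below is a minimal illustration, not the paper's exact formulation; the label set, the pooling choice (summing learned vectors over a token's root-to-leaf ancestor path in a constituency parse), and all names are assumptions for demonstration.

```python
# Hypothetical sketch of a structural embedding: pool learned vectors for the
# syntactic labels on a token's root-to-leaf path in a constituency tree.
# Label vectors are random stand-ins for learned parameters.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # illustrative embedding dimension

# One vector per syntactic label (would be learned jointly with the model).
label_vecs = {lab: rng.normal(size=DIM)
              for lab in ["S", "NP", "VP", "DT", "NN", "VBZ"]}

def structural_embedding(ancestor_labels):
    """Sum the label vectors along a token's root-to-leaf path."""
    return np.sum([label_vecs[lab] for lab in ancestor_labels], axis=0)

# "The cat sleeps" -> per-token ancestor paths from a constituency parse:
# (S (NP (DT The) (NN cat)) (VP (VBZ sleeps)))
paths = {
    "The":    ["S", "NP", "DT"],
    "cat":    ["S", "NP", "NN"],
    "sleeps": ["S", "VP", "VBZ"],
}
token_structs = {tok: structural_embedding(p) for tok, p in paths.items()}
# In a full model, each token_structs[tok] would be concatenated with the
# token's word embedding before feeding the encoder.
```

Tokens sharing a subtree (here, "The" and "cat" under the same NP) end up with similar structural vectors, which is how syntactic boundaries become visible to the downstream attention model.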

Rui Liu, Junjie Hu, Wei Wei, Zi Yang, Eric Nyberg • 2017

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Question Answering | SQuAD v1.1 (dev) | F1 Score | 84.1 | 375 |
| Question Answering | SQuAD v1.1 (test) | F1 Score | 80.8 | 260 |
| Question Answering | SQuAD (test) | F1 | 80.84 | 111 |
| Question Answering | SQuAD hidden 1.1 (test) | EM | 68.2 | 18 |
| Question Answering | AddOneSent (test) | EM | 40 | 15 |
| Question Answering | adversarial SQuAD (test) | Add Sent Score | 33.9 | 12 |
| Reading Comprehension | Adversarial SQuAD AddSent v1.1 (test) | F1 | 33.9 | 10 |
| Reading Comprehension | Adversarial SQuAD AddOneSent v1.1 (test) | F1 Score | 44.8 | 10 |
| Question Answering | AddSent | EM | 30 | 8 |
| Machine Reading Comprehension | SQuAD AddSent | EM | 30 | 7 |
Showing 10 of 11 rows
