
Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks

About

Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks. The only underlying LSTM structure that has been explored so far is a linear chain. However, natural language exhibits syntactic properties that naturally combine words into phrases. We introduce the Tree-LSTM, a generalization of LSTMs to tree-structured network topologies. Tree-LSTMs outperform all existing systems and strong LSTM baselines on two tasks: predicting the semantic relatedness of two sentences (SemEval 2014, Task 1) and sentiment classification (Stanford Sentiment Treebank).

Kai Sheng Tai, Richard Socher, Christopher D. Manning · 2015
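The tree-structured generalization described in the abstract can be sketched as a single node update of the paper's Child-Sum Tree-LSTM variant: the node sums its children's hidden states, applies one forget gate per child, and composes a new memory cell. The NumPy class below is an illustrative minimal sketch, not the authors' released implementation (which was written in Torch); all names, shapes, and the initialization scheme here are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ChildSumTreeLSTMCell:
    """Minimal sketch of one Child-Sum Tree-LSTM node update.

    Hypothetical NumPy illustration of the composition in Tai et al.
    (2015); weight names and initialization are assumptions.
    """

    def __init__(self, in_dim, mem_dim, seed=0):
        rng = np.random.default_rng(seed)
        scale = 0.1
        # Input-to-gate (W) and hidden-to-gate (U) weights plus biases
        # for the input (i), forget (f), output (o), and update (u) gates.
        self.W = {g: rng.normal(0, scale, (mem_dim, in_dim)) for g in "ifou"}
        self.U = {g: rng.normal(0, scale, (mem_dim, mem_dim)) for g in "ifou"}
        self.b = {g: np.zeros(mem_dim) for g in "ifou"}
        self.mem_dim = mem_dim

    def forward(self, x, child_h, child_c):
        """x: (in_dim,) word input; child_h, child_c: lists of child states."""
        # h_tilde: sum of the children's hidden states (zero for a leaf).
        h_tilde = np.sum(child_h, axis=0) if child_h else np.zeros(self.mem_dim)
        i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_tilde + self.b["i"])
        o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_tilde + self.b["o"])
        u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_tilde + self.b["u"])
        # One forget gate per child, conditioned on that child's own
        # hidden state, so the node can selectively keep each child's memory.
        c = i * u
        for h_k, c_k in zip(child_h, child_c):
            f_k = sigmoid(self.W["f"] @ x + self.U["f"] @ h_k + self.b["f"])
            c = c + f_k * c_k
        h = o * np.tanh(c)
        return h, c
```

Applied bottom-up over a parse tree, each node calls `forward` with its word vector (or a zero vector for internal nodes, in one common setup) and the states returned by its children; the root's hidden state then feeds the task classifier.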

Related benchmarks

Task                         Dataset          Result           Rank
Node Classification          Pubmed           --               742
Subjectivity Classification  Subj             Accuracy 92.1    266
Text Classification          AG News (test)   Accuracy 86.06   210
Question Classification      TREC             Accuracy 93      205
Relation Extraction          TACRED (test)    F1 Score 62.4    194
Text Classification          TREC             Accuracy 91.8    179
Sentiment Classification     SST-2            Accuracy 88      174
Natural Language Inference   SNLI             Accuracy 80.6    174
Sentiment Analysis           SST-5 (test)     Accuracy 51      173
Sentiment Classification     MR               Accuracy 80      148

Showing 10 of 54 rows.

Other info

Code
