
Cached Long Short-Term Memory Neural Networks for Document-Level Sentiment Classification

About

Recently, neural networks have achieved great success on sentiment classification due to their ability to alleviate feature engineering. However, one remaining challenge is modeling long texts in document-level sentiment classification under a recurrent architecture, because of the limited capacity of the memory unit. To address this problem, we present a Cached Long Short-Term Memory neural network (CLSTM) to capture the overall semantic information in long texts. CLSTM introduces a cache mechanism that divides the memory into several groups with different forgetting rates, enabling the network to retain sentiment information better within a recurrent unit. The proposed CLSTM outperforms state-of-the-art models on three publicly available document-level sentiment analysis datasets.
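The core idea — splitting the memory cell into groups whose forget rates are confined to different bands, so low-indexed groups act as a fast-forgetting short-term cache and high-indexed groups as a slow-forgetting long-term store — can be sketched as follows. This is a minimal NumPy illustration of the banded-forget-gate mechanism, not the authors' exact formulation; the group boundaries `[k/K, (k+1)/K]` and the tied input gate `i = 1 - f` are assumptions made for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def clstm_step(x, h_prev, c_prev, params, K=4):
    """One step of a cached-LSTM-style cell (illustrative sketch).

    The memory c is split into K groups; group k's forget rate is
    squashed into the band [k/K, (k+1)/K], so earlier groups forget
    quickly (short-term cache) and later groups forget slowly
    (long-term store). Assumes hidden size divisible by K.
    """
    Wf, Wc, Wo, bf, bc, bo = params
    z = np.concatenate([x, h_prev])
    H = h_prev.size
    g = H // K  # group size
    f_raw = sigmoid(Wf @ z + bf)          # unconstrained rate in (0, 1)
    f = np.empty(H)
    for k in range(K):
        sl = slice(k * g, (k + 1) * g)
        f[sl] = (k + f_raw[sl]) / K       # squash into [k/K, (k+1)/K]
    i = 1.0 - f                           # input gate tied to forget gate (assumption)
    c_tilde = np.tanh(Wc @ z + bc)
    c = f * c_prev + i * c_tilde
    o = sigmoid(Wo @ z + bo)
    h = o * np.tanh(c)
    return h, c

# Usage: random weights for a toy cell with input dim 5, hidden dim 8.
rng = np.random.default_rng(0)
D, H = 5, 8
params = tuple(rng.standard_normal((H, D + H)) * 0.1 for _ in range(3)) \
       + tuple(np.zeros(H) for _ in range(3))
h, c = clstm_step(rng.standard_normal(D), np.zeros(H), np.zeros(H), params, K=4)
```

With `K=1` this reduces to an ordinary coupled-gate LSTM step; larger `K` gives the cell a spectrum of forgetting time scales within a single layer.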

Jiacheng Xu, Danlu Chen, Xipeng Qiu, Xuanjing Huang • 2016

Related benchmarks

| Task                            | Dataset                   | Metric   | Result | Rank |
|---------------------------------|---------------------------|----------|--------|------|
| Text Classification             | IMDB (test)               | CA       | 42.1   | 79   |
| Review Sentiment Classification | Yelp 2014 (test)          | Accuracy | 63.7   | 41   |
| Sentiment Analysis              | Yelp '13 (test)           | Accuracy | 59.4   | 33   |
| Sentiment Classification        | Yelp original 2013 (test) | RMSE     | 0.729  | 23   |
