
Molding CNNs for text: non-linear, non-consecutive convolutions

About

The success of deep learning often derives from well-chosen operational building blocks. In this work, we revise the temporal convolution operation in CNNs to better adapt it to text processing. Instead of concatenating word representations, we appeal to tensor algebra and use low-rank n-gram tensors to directly exploit interactions between words already at the convolution stage. Moreover, we extend the n-gram convolution to non-consecutive words to recognize patterns with intervening words. Through a combination of low-rank tensors and pattern weighting, we can efficiently evaluate the resulting convolution operation via dynamic programming. We test the resulting architecture on standard sentiment classification and news categorization tasks. Our model achieves state-of-the-art performance both in terms of accuracy and training speed. For instance, we obtain 51.2% accuracy on the fine-grained sentiment classification task.
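The dynamic program hinted at in the abstract can evaluate the non-consecutive convolution in linear time: a running state accumulates lower-order factors, decayed by a weight for each skipped word. Below is a minimal NumPy sketch for the bigram (n = 2) case; the function name, the elementwise combination of factors, and the decay parameter `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def nonconsecutive_bigram_conv(X, W1, W2, lam=0.5):
    """Sketch of a non-consecutive bigram convolution via dynamic programming.

    X:  (T, d) matrix of word embeddings for a sentence of length T.
    W1: (d, h) factor matrix applied to the first word of a pair.
    W2: (d, h) factor matrix applied to the second word of a pair.
    lam: decay weight; a pair (i, j) with i < j contributes with
         weight lam ** (j - i - 1), penalizing intervening words.
    Returns (T, h) features, one vector per ending position.
    """
    T = X.shape[0]
    h = W1.shape[1]
    s1 = np.zeros(h)          # decayed sum of first-word factors seen so far
    out = np.zeros((T, h))
    for t in range(T):
        f1 = X[t] @ W1        # factor if word t is the first word of a pair
        f2 = X[t] @ W2        # factor if word t is the second word of a pair
        out[t] = s1 * f2      # combine with all earlier (decayed) first words
        s1 = lam * s1 + f1    # decay history by lam and add current position
    return out
```

Each position is visited once, so the cost is O(T) per feature dimension even though the sum ranges over all O(T^2) word pairs, which is the point of the dynamic-programming formulation.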

Tao Lei, Regina Barzilay, Tommi Jaakkola • 2015

Related benchmarks

Task | Dataset | Result | Rank
Sentiment Analysis | SST-5 (test) | Accuracy: 51.2 | 173
Text Classification | SST-2 | Accuracy: 88.6 | 121
Sentiment Classification | Stanford Sentiment Treebank SST-2 (test) | Accuracy: 88.6 | 99
Text Classification | SST-1 | Accuracy: 51.2 | 45
Sentence Classification | Stanford Sentiment Treebank (SST) fine-grained (test) | Accuracy: 51.2 | 40
Sentiment Analysis | Stanford Sentiment Treebank (SST) (test) | Accuracy: 51.2 | 10
