
Convolutional Neural Networks with Recurrent Neural Filters

About

We introduce a class of convolutional neural networks (CNNs) that utilize recurrent neural networks (RNNs) as convolution filters. A convolution filter is typically implemented as a linear affine transformation followed by a non-linear function, which fails to account for language compositionality. As a result, it limits the use of high-order filters that are often warranted for natural language processing tasks. In this work, we model convolution filters with RNNs that naturally capture compositionality and long-term dependencies in language. We show that simple CNN architectures equipped with recurrent neural filters (RNFs) achieve results that are on par with the best published ones on the Stanford Sentiment Treebank and two answer sentence selection datasets.
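The core idea above can be sketched in a few lines of plain Python. This is a minimal, untrained illustration with random parameters, not the paper's implementation: instead of the usual linear filter response w·x + b per n-gram window, each window is fed through a small tanh RNN and the final hidden state is taken as the filter's response; the resulting feature maps are then max-pooled over time, as in a standard CNN for text. All names (`rnn_filter`, `rnf_conv_layer`) and dimensions are illustrative assumptions.

```python
import math
import random

def rnn_filter(window, Wx, Wh, b):
    """Run a simple tanh RNN over the word vectors of one n-gram window.

    Returns the final hidden state, used as the recurrent neural
    filter's (RNF's) response to that window.
    """
    h = [0.0] * len(b)  # initial hidden state
    for x in window:
        h = [math.tanh(
                sum(Wx[i][j] * x[j] for j in range(len(x))) +
                sum(Wh[i][j] * h[j] for j in range(len(h))) +
                b[i])
             for i in range(len(b))]
    return h

def rnf_conv_layer(sentence, Wx, Wh, b, n=3):
    """Slide the RNF over all n-gram windows, then max-pool over time."""
    responses = [rnn_filter(sentence[t:t + n], Wx, Wh, b)
                 for t in range(len(sentence) - n + 1)]
    # max-over-time pooling, one pooled value per hidden dimension
    return [max(r[i] for r in responses) for i in range(len(b))]

random.seed(0)
embed_dim, hidden_dim, n = 4, 3, 3
rand = lambda r, c: [[random.uniform(-0.5, 0.5) for _ in range(c)]
                     for _ in range(r)]
Wx = rand(hidden_dim, embed_dim)   # input-to-hidden weights
Wh = rand(hidden_dim, hidden_dim)  # hidden-to-hidden weights
b = [0.0] * hidden_dim

# A toy 6-word "sentence" of random embeddings stands in for real input.
sentence = [[random.uniform(-1, 1) for _ in range(embed_dim)]
            for _ in range(6)]
features = rnf_conv_layer(sentence, Wx, Wh, b, n=n)
print(len(features))  # one pooled feature per hidden unit
```

Because the RNN is run separately over each window, the layer keeps the CNN's sliding-window structure while the filter itself models word order and composition within the window, which a single affine transform cannot.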

Yi Yang • 2018

Related benchmarks

Task                                  | Dataset                                   | Metric                  | Result | Rank
Answer Selection                      | WikiQA (test)                             | MAP                     | 0.729  | 149
Sentiment Classification              | Stanford Sentiment Treebank SST-2 (test)  | Accuracy                | 90     | 99
Fine-grained Sentiment Classification | Stanford Sentiment Treebank (SST) (test)  | Accuracy (Fine-grained) | 53.4   | 17
Answer Sentence Selection             | QASent (test)                             | MAP                     | 78     | 8
