
Revisiting LSTM Networks for Semi-Supervised Text Classification via Mixed Objective Function

About

In this paper, we study the bidirectional LSTM network for the task of text classification using both supervised and semi-supervised approaches. Several prior works have suggested that either complex pretraining schemes using unsupervised methods such as language modeling (Dai and Le 2015; Miyato, Dai, and Goodfellow 2016) or complicated models (Johnson and Zhang 2017) are necessary to achieve high classification accuracy. However, we develop a training strategy that allows even a simple BiLSTM model, when trained with cross-entropy loss, to achieve competitive results compared with more complex approaches. Furthermore, by combining cross-entropy loss with entropy minimization, adversarial, and virtual adversarial losses on both labeled and unlabeled data, we report state-of-the-art results for the text classification task on several benchmark datasets. In particular, on the ACL-IMDB sentiment analysis and AG-News topic classification datasets, our method outperforms current approaches by a substantial margin. We also show the generality of the mixed objective function by improving performance on a relation extraction task.
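The abstract describes a mixed objective that sums cross-entropy on labeled data with entropy minimization on unlabeled data plus adversarial and virtual adversarial terms. A minimal NumPy sketch of how such a weighted sum might be assembled is below; the weighting coefficients (`lambda_*`) and the placeholder adversarial/virtual-adversarial loss inputs are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def softmax(logits):
    # numerically stable softmax over the last axis
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, labels):
    # supervised loss on labeled examples
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def entropy_minimization(logits):
    # conditional entropy of model predictions on unlabeled examples;
    # minimizing it pushes predictions toward confident (low-entropy) outputs
    p = softmax(logits)
    return -np.mean(np.sum(p * np.log(p + 1e-12), axis=-1))

def mixed_objective(sup_logits, labels, unsup_logits,
                    adv_loss=0.0, vadv_loss=0.0,
                    lambda_em=1.0, lambda_at=1.0, lambda_vat=1.0):
    # weighted sum of the four terms named in the abstract; the adversarial
    # and virtual adversarial losses are passed in as precomputed scalars here
    return (cross_entropy(sup_logits, labels)
            + lambda_em * entropy_minimization(unsup_logits)
            + lambda_at * adv_loss
            + lambda_vat * vadv_loss)
```

In practice each term would be computed from the BiLSTM's outputs within the training loop; this sketch only shows how the terms combine into one scalar objective.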

Devendra Singh Sachan, Manzil Zaheer, Ruslan Salakhutdinov• 2020

Related benchmarks

Task | Dataset | Metric | Result | Rank
Text Classification | AG News (test) | – | – | 210
Relation Extraction | TACRED (test) | F1 Score | 66.8 | 194
Sentiment Classification | IMDB (test) | Error Rate | 4.32 | 144
Topic Classification | DBPedia (test) | – | – | 64
Text Classification | DBPedia (test) | Test Error Rate | 0.7 | 40
Text Categorization | RCV1 (test) | Error Rate | 6.23 | 24
Binary Sentiment Classification | ACL-IMDB (test) | Error Rate | 4.32 | 12
Fine-grained Sentiment Classification | IMDB (test) | Error Rate (%) | 34.04 | 9
Text Classification | AG (test) | Test Error Rate | 0.0495 | 9
Relation Extraction | SemEval-2010 Task 8 (test) | – | – | 8

Showing 10 of 14 rows

Other info

Code
