
Sentence-State LSTM for Text Representation

About

Bi-directional LSTMs are a powerful tool for text representation. However, they have been shown to suffer various limitations due to their sequential nature. We investigate an alternative LSTM structure for encoding text, which consists of a parallel state for each word. Recurrent steps are used to perform local and global information exchange between words simultaneously, rather than incremental reading of a sequence of words. Results on various classification and sequence labelling benchmarks show that the proposed model has strong representation power, giving highly competitive performance compared to stacked BiLSTM models with similar numbers of parameters.
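The core idea above — one state per word, updated in parallel at each recurrent step from its local neighbours and a sentence-level state — can be sketched as follows. This is a simplified illustration, not the paper's exact model: the S-LSTM's gating mechanism is omitted, the weight matrices (`W_h`, `W_g`) and window handling are assumptions, and a plain `tanh` update stands in for the full cell.

```python
import numpy as np

def slstm_step(H, g, W_h, W_g):
    """One simplified S-LSTM-style recurrent step (gates omitted).

    H: (n, d) per-word hidden states; g: (d,) sentence-level state.
    Every word state is updated simultaneously from its local window
    (left neighbour, self, right neighbour) plus the global state,
    and the global state aggregates all updated word states.
    """
    n, d = H.shape
    # Zero-pad so boundary words see a zero left/right neighbour.
    left = np.vstack([np.zeros((1, d)), H[:-1]])
    right = np.vstack([H[1:], np.zeros((1, d))])
    # Local + global information exchange, in parallel for all words.
    context = np.concatenate([left, H, right, np.tile(g, (n, 1))], axis=1)
    H_new = np.tanh(context @ W_h)                # (n, 4d) @ (4d, d) -> (n, d)
    # Sentence state reads from the mean of the updated word states.
    g_new = np.tanh(np.concatenate([g, H_new.mean(axis=0)]) @ W_g)  # -> (d,)
    return H_new, g_new

# Usage: a few recurrent steps over a 5-word "sentence" of 8-dim states.
rng = np.random.default_rng(0)
n, d = 5, 8
H, g = rng.standard_normal((n, d)), np.zeros(d)
W_h = rng.standard_normal((4 * d, d)) * 0.1
W_g = rng.standard_normal((2 * d, d)) * 0.1
for _ in range(3):
    H, g = slstm_step(H, g, W_h, W_g)
print(H.shape, g.shape)  # (5, 8) (8,)
```

Unlike a BiLSTM, whose step count grows with sentence length, here the number of recurrent steps is a fixed hyperparameter: after t steps each word has exchanged information with a 2t-word neighbourhood plus the global state.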

Yue Zhang, Qi Liu, Linfeng Song • 2018

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Named Entity Recognition | CoNLL 2003 (test) | F1 Score: 91.57 | 539 |
| Named Entity Recognition | CoNLL English 2003 (test) | F1 Score: 91.57 | 135 |
| Natural Language Understanding | Snips (test) | Intent Acc: 98.3 | 27 |
| POS Tagging | PTB (test) | Accuracy: 97.55 | 24 |
| Spoken Language Understanding | ATIS (test) | Slot F1: 95.65 | 18 |
| Text Classification | movie review dataset (test) | Accuracy: 82.45 | 12 |
| Text Classification | MTL-16 (test) | Average Accuracy: 85.38 | 4 |
| Intent Detection | CAIS | Accuracy: 94.36 | 3 |
| Slot Filling | CAIS | F1 Score: 85.74 | 3 |

Other info

Code
