
Dynamic Self-Attention: Computing Attention over Words Dynamically for Sentence Embedding

About

In this paper, we propose Dynamic Self-Attention (DSA), a new self-attention mechanism for sentence embedding. We design DSA by modifying dynamic routing in capsule networks (Sabour et al., 2017) for natural language processing. DSA attends to informative words with a dynamic weight vector. We achieve new state-of-the-art results among sentence encoding methods on the Stanford Natural Language Inference (SNLI) dataset with the fewest parameters, while showing competitive results on the Stanford Sentiment Treebank (SST) dataset.

Deunsol Yoon, Dongbok Lee, SangKeun Lee · 2018
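The abstract describes attention weights that are refined by a dynamic weight vector rather than a fixed, learned query. Below is a minimal, hypothetical NumPy sketch of that idea: attention logits over words are recomputed across a few routing-style iterations, with the query vector updated from the attended summary each round, in the spirit of dynamic routing (Sabour et al., 2017). The function name, the tanh update, and the iteration count are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def dynamic_self_attention(H, n_iter=3):
    """Routing-style attention pooling (illustrative sketch, not the exact DSA equations).

    H: (n_words, dim) array of word representations.
    Returns a single (dim,) sentence vector.
    """
    n, d = H.shape
    q = np.zeros(d)                      # dynamic query vector, zero-initialized as in routing
    for _ in range(n_iter):
        logits = H @ q                   # agreement between each word and the current query
        alpha = np.exp(logits - logits.max())
        alpha /= alpha.sum()             # softmax attention over words (uniform on the first pass)
        s = alpha @ H                    # attention-weighted sentence summary
        q = np.tanh(s)                   # update the query from the summary (assumed nonlinearity)
    return s

# Example: pool 12 random 64-dim word vectors into one sentence vector.
rng = np.random.default_rng(0)
sentence_vec = dynamic_self_attention(rng.normal(size=(12, 64)))
```

The key contrast with standard self-attention pooling is that the query here is computed per input sentence instead of being a fixed learned parameter, which is what lets the attention adapt dynamically to each sentence.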

Related benchmarks

Task                         Dataset                                    Metric    Result  Rank
Natural Language Inference   SNLI (test)                                Accuracy  87.4    681
Sentiment Analysis           SST-5 (test)                               Accuracy  50.6    173
Sentiment Analysis           SST-2 (test)                               Accuracy  88.5    136
Natural Language Inference   SNLI 1.0 (test)                            Accuracy  87.4    19
Sentiment Analysis           Stanford Sentiment Treebank (SST) (test)   Accuracy  50.6    10
Natural Language Inference   SNLI 1.0 (train)                           Accuracy  89      9
