
Co-Stack Residual Affinity Networks with Multi-level Attention Refinement for Matching Text Sequences

About

Learning a matching function between two text sequences is a long-standing problem in NLP research, with applications such as question answering and paraphrase identification. This paper proposes Co-Stack Residual Affinity Networks (CSRAN), a new and universal neural architecture for this problem. CSRAN is a deep architecture built from stacked (multi-layered) recurrent encoders. Stacked/deep architectures are traditionally difficult to train owing to inherent weaknesses such as poor feature propagation and vanishing gradients. CSRAN incorporates two novel components to take advantage of the stacked architecture. First, it introduces a new bidirectional alignment mechanism that learns affinity weights by fusing sequence pairs across stacked hierarchies. Second, it leverages a multi-level attention refinement component between stacked recurrent layers. The key intuition is that leveraging information across all network hierarchies not only improves gradient flow but also improves overall performance. Extensive experiments on six well-studied text sequence matching datasets show state-of-the-art performance on all of them.
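The co-stack alignment idea above, fusing affinity scores across all stacked layers before soft-aligning the two sequences, can be sketched in a toy NumPy form. Note this is a minimal illustration under stated assumptions: the function names, the dot-product affinity, and the simple sum fusion across layers are placeholders, not the paper's exact formulation (which the abstract does not specify).

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def costack_affinity(a_layers, b_layers):
    """Affinity matrix fused across the whole stack (illustrative).

    a_layers: list of L arrays, each (len_a, d), hypothetical per-layer
    encodings of sequence A; b_layers likewise for sequence B.
    Returns E of shape (len_a, len_b) where each entry sums the
    dot-product affinity contributed by every layer of the stack.
    """
    E = np.zeros((a_layers[0].shape[0], b_layers[0].shape[0]))
    for a_l, b_l in zip(a_layers, b_layers):
        E += a_l @ b_l.T  # every hierarchy contributes to the affinity
    return E

def bidirectional_align(a_layers, b_layers):
    """Soft-align each sequence against the other via the fused affinity."""
    E = costack_affinity(a_layers, b_layers)
    a_top, b_top = a_layers[-1], b_layers[-1]  # top-of-stack encodings
    b_aligned = softmax(E, axis=1) @ b_top     # B summarized for each token of A
    a_aligned = softmax(E, axis=0).T @ a_top   # A summarized for each token of B
    return a_aligned, b_aligned
```

Because every layer's representations feed the affinity computation, gradients reach lower layers through the alignment path as well as through the recurrent stack, which is the intuition the abstract attributes to the design.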

Yi Tay, Luu Anh Tuan, Siu Cheung Hui • 2018

Related benchmarks

Task                         Dataset                       Result          Rank
Natural Language Inference   SNLI (test)                   Accuracy 88.7   681
Natural Language Inference   SciTail (test)                Accuracy 86.7   86
Paraphrase Identification    Quora Question Pairs (test)   Accuracy 89.2   72
