
Multi-Task Deep Neural Networks for Natural Language Understanding

About

In this paper, we present a Multi-Task Deep Neural Network (MT-DNN) for learning representations across multiple natural language understanding (NLU) tasks. MT-DNN not only leverages large amounts of cross-task data, but also benefits from a regularization effect that leads to more general representations in order to adapt to new tasks and domains. MT-DNN extends the model proposed in Liu et al. (2015) by incorporating a pre-trained bidirectional transformer language model, known as BERT (Devlin et al., 2018). MT-DNN obtains new state-of-the-art results on ten NLU tasks, including SNLI, SciTail, and eight out of nine GLUE tasks, pushing the GLUE benchmark to 82.7% (2.2% absolute improvement). We also demonstrate using the SNLI and SciTail datasets that the representations learned by MT-DNN allow domain adaptation with substantially fewer in-domain labels than the pre-trained BERT representations. The code and pre-trained models are publicly available at https://github.com/namisan/mt-dnn.
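The architecture the abstract describes — lower layers shared across all tasks (the pre-trained BERT encoder) with a small task-specific output head per NLU task, trained on mini-batches merged and shuffled across tasks — can be sketched as follows. This is a minimal structural illustration, not the authors' implementation: the random projection stands in for BERT's transformer layers, the dimensions and task list are illustrative, and the loss/backward pass is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins: BERT-base uses a 768-dim hidden state; the GLUE
# tasks have their own label counts. These three tasks are just examples.
HIDDEN = 8
TASKS = {"MNLI": 3, "SST-2": 2, "SciTail": 2}  # task -> number of classes

# Shared "encoder": one random projection standing in for the pre-trained
# transformer layers that every task shares in MT-DNN.
W_shared = rng.normal(size=(HIDDEN, HIDDEN))

# Task-specific heads: a separate small classifier per task.
heads = {t: rng.normal(size=(HIDDEN, k)) for t, k in TASKS.items()}

def forward(x, task):
    """Encode with the shared layers, then apply the task's own head."""
    h = np.tanh(x @ W_shared)   # shared representation (all tasks)
    return h @ heads[task]      # task-specific logits

def train_epoch(batches):
    """MT-DNN-style epoch: pool all tasks' mini-batches, shuffle them, and
    visit each batch once (loss computation and updates omitted here)."""
    order = rng.permutation(len(batches))
    for i in order:
        x, task = batches[i]
        _ = forward(x, task)
    return len(order)

# Two toy mini-batches of 4 examples per task, pooled across tasks.
batches = [(rng.normal(size=(4, HIDDEN)), t) for t in TASKS for _ in range(2)]
n_batches_seen = train_epoch(batches)
```

Because the encoder parameters receive gradients from every task's batches while each head sees only its own, the shared layers are pushed toward representations that work across tasks — the regularization effect the abstract refers to.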

Xiaodong Liu, Pengcheng He, Weizhu Chen, Jianfeng Gao • 2019

Related benchmarks

Task                             Dataset             Result                 Rank
Natural Language Inference       SNLI (test)         Accuracy 91.7          681
Natural Language Understanding   GLUE (dev)          SST-2 (Acc) 94.3       504
Natural Language Understanding   GLUE                SST-2 95.6             452
Natural Language Understanding   GLUE (test)         SST-2 Accuracy 96.5    416
Natural Language Inference       SNLI                Accuracy 91.6          174
Natural Language Understanding   GLUE (val)          SST-2 94.3             170
Natural Language Inference       SciTail (test)      Accuracy 95            86
Natural Language Understanding   SuperGLUE           SGLUE Score 71.26      84
Natural Language Inference       SNLI (dev)          Accuracy 92.2          71
General Language Understanding   GLUE v1 (test dev)  MNLI 84.95             40

Showing 10 of 20 rows.
