
A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks

About

Transfer and multi-task learning have traditionally focused on either a single source-target pair or very few, similar tasks. Ideally, the linguistic levels of morphology, syntax and semantics would benefit each other by being trained in a single model. We introduce a joint many-task model together with a strategy for successively growing its depth to solve increasingly complex tasks. Higher layers include shortcut connections to lower-level task predictions to reflect linguistic hierarchies. We use a simple regularization term that allows all model weights to be optimized for one task's loss without catastrophic interference with the other tasks. Our single end-to-end model obtains state-of-the-art or competitive results on five different tasks spanning tagging, parsing, relatedness, and entailment.
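The two architectural ideas in the abstract, shortcut connections that feed a lower layer's label predictions into the next task's layer, and a "successive" regularization term that penalizes drift of earlier tasks' weights instead of freezing them, can be sketched in a few lines. This is a minimal illustration with made-up dimensions and plain tanh layers standing in for the paper's bi-LSTM layers; all names (`W_pos`, `W_chunk`, `delta`, etc.) are hypothetical, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, W):
    # A simple tanh layer standing in for one of the model's bi-LSTM layers.
    return np.tanh(x @ W)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical dimensions: 4-dim token inputs, 3 POS tags, 5 chunk tags.
d_in, n_pos, n_chunk = 4, 3, 5
W_pos = rng.standard_normal((d_in, n_pos))
# Shortcut connection: the chunking layer sees the raw input *and*
# the POS layer's label predictions, so its input is d_in + n_pos wide.
W_chunk = rng.standard_normal((d_in + n_pos, n_chunk))

x = rng.standard_normal((2, d_in))           # a batch of 2 tokens
pos_probs = softmax(layer(x, W_pos))         # lower-level task prediction
chunk_in = np.concatenate([x, pos_probs], axis=-1)
chunk_probs = softmax(layer(chunk_in, W_chunk))

# Successive regularization: while training the chunking task, add
# delta * ||W_pos - W_pos_prev||^2 to the loss, where W_pos_prev holds
# the POS weights from the end of the previous (POS) training epoch.
# This lets W_pos keep moving, but discourages catastrophic interference.
W_pos_prev = W_pos.copy()
delta = 1e-2
reg = delta * np.sum((W_pos - W_pos_prev) ** 2)
```

The key design point is that the lower task's output is treated as an input feature rather than a frozen module, so gradients from the chunking loss still reach `W_pos`, with the regularizer keeping it close to its post-POS-training values.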

Kazuma Hashimoto, Caiming Xiong, Yoshimasa Tsuruoka, Richard Socher · 2016

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Chunking | CoNLL 2000 (test) | F1 Score | 95.77 | 88 |
| Dependency Parsing | WSJ (test) | UAS | 94.67 | 67 |
| Part-of-Speech Tagging | Penn Treebank (test) | Accuracy | 97.55 | 64 |
| Part-of-Speech Tagging | WSJ (test) | Accuracy | 97.55 | 51 |
| Part-of-Speech Tagging | POS (test) | Accuracy | 97.55 | 33 |
| Chunking | Chunk (test) | F1 Score | 95.77 | 28 |
| Dependency Parsing | Dep. Parse (test) | UAS | 94.7 | 23 |
| Textual Entailment | SICK (test) | Accuracy | 86.8 | 21 |
| Dependency Parsing | Penn Treebank (PTB) Section 23 v2.2 (test) | UAS | 94.67 | 17 |
| Parsing | English PTB-SD 3.3.0 (test) | UAS | 94.67 | 7 |

Showing 10 of 11 rows.

Other info

Code
