
A Hierarchical Multi-task Approach for Learning Embeddings from Semantic Tasks

About

Much effort has been devoted to evaluating whether multi-task learning can be leveraged to learn rich representations that can be used in various Natural Language Processing (NLP) downstream applications. However, there is still a lack of understanding of the settings in which multi-task learning has a significant effect. In this work, we introduce a hierarchical model trained in a multi-task learning setup on a set of carefully selected semantic tasks. The model is trained in a hierarchical fashion to introduce an inductive bias by supervising a set of low-level tasks at the bottom layers of the model and more complex tasks at the top layers of the model. This model achieves state-of-the-art results on a number of tasks, namely Named Entity Recognition, Entity Mention Detection and Relation Extraction, without hand-engineered features or external NLP tools like syntactic parsers. The hierarchical training supervision induces a set of shared semantic representations at lower layers of the model. We show that as we move from the bottom to the top layers of the model, the hidden states of the layers tend to represent more complex semantic information.
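The core idea of the abstract can be illustrated with a toy sketch: stack encoder layers and attach each task's head at a different depth, so that supervision for a low-level task reaches the bottom layer while harder tasks are supervised higher up. This is a minimal NumPy illustration, not the authors' implementation; the layer count, dimensions, tanh layers (standing in for the paper's biLSTM encoders), and task-to-depth assignment are all assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, W):
    # one "encoder layer": linear map + tanh
    # (a stand-in for the paper's recurrent encoder layers)
    return np.tanh(x @ W)

# hypothetical dimensions for the sketch
d = 8          # hidden size
n_tokens = 5   # sentence length

# three stacked layers; each depth supervises a task of increasing complexity
Ws = [rng.standard_normal((d, d)) * 0.1 for _ in range(3)]

x = rng.standard_normal((n_tokens, d))  # token embeddings
hidden_states = []
h = x
for W in Ws:
    h = layer(h, W)
    hidden_states.append(h)

# task heads read from different depths: a low-level task at layer 0,
# more complex tasks at higher layers (assignment here is illustrative)
ner_logits = hidden_states[0] @ rng.standard_normal((d, 4))   # e.g. NER / mention detection
rel_logits = hidden_states[1] @ rng.standard_normal((d, 6))   # e.g. relation extraction
coref_score = hidden_states[2] @ rng.standard_normal((d, 1))  # e.g. coreference

print(ner_logits.shape, rel_logits.shape, coref_score.shape)
```

In training, each task's loss would be backpropagated from its own depth, so the bottom layers receive gradient signal from every task above them; this is how hierarchical supervision induces shared low-level representations.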

Victor Sanh, Thomas Wolf, Sebastian Ruder • 2018

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Named Entity Recognition | CoNLL 03 | F1 (Entity) | 91.63 | 102 |
| Relation Extraction | ACE05 (test) | F1 Score | 62.7 | 72 |
| Entity extraction | ACE05 (test) | F1 Score | 87.5 | 53 |
| Named Entity Recognition | ACE05 | F1 Score | 87.51 | 38 |
| Coreference Resolution | CoNLL 2012 | Average F1 | 70.14 | 17 |
| Entity Mention Detection | ACE05 (351/80/80) | Precision | 87.03 | 12 |
| Relation Extraction | ACE 2005 (351/80/80) | Precision | 68.66 | 8 |
| Coreference Resolution | ACE05 (351/80/80) | MUC | 75.73 | 5 |
| Coreference Resolution | ACE05 | MUC | 82.49 | 4 |
