
An Overview of Multi-Task Learning in Deep Neural Networks

About

Multi-task learning (MTL) has led to successes in many applications of machine learning, from natural language processing and speech recognition to computer vision and drug discovery. This article aims to give a general overview of MTL, particularly in deep neural networks. It introduces the two most common methods for MTL in deep learning, gives an overview of the literature, and discusses recent advances. In particular, it seeks to help ML practitioners apply MTL by shedding light on how MTL works and providing guidelines for choosing appropriate auxiliary tasks.
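The most common MTL method referenced in the deep learning literature is hard parameter sharing: a shared trunk feeds several task-specific heads, and training minimizes a sum of per-task losses. The sketch below illustrates that idea only; the layer sizes, task names, and loss weighting are illustrative assumptions, not details from the article.

```python
import numpy as np

# Hard parameter sharing, sketched with plain NumPy:
# one shared hidden layer, plus a separate linear head per task.
# All shapes and task names here are hypothetical.

rng = np.random.default_rng(0)
n_features, n_hidden, batch = 8, 4, 5

# Shared parameters (in training, these receive gradients from ALL tasks).
W_shared = rng.normal(size=(n_features, n_hidden))

# Task-specific heads (each would be updated only by its own task's loss).
heads = {
    "task_a": rng.normal(size=(n_hidden, 1)),
    "task_b": rng.normal(size=(n_hidden, 1)),
}

def forward(x, task):
    """Shared representation followed by a task-specific linear head."""
    h = np.tanh(x @ W_shared)   # shared trunk, reused across tasks
    return h @ heads[task]      # per-task output head

x = rng.normal(size=(batch, n_features))
targets = {t: rng.normal(size=(batch, 1)) for t in heads}

# Multi-task objective: unweighted sum of per-task mean-squared errors.
total_loss = sum(
    float(np.mean((forward(x, t) - targets[t]) ** 2)) for t in heads
)
print(f"combined MTL loss: {total_loss:.4f}")
```

The other common method the article refers to, soft parameter sharing, would instead give each task its own trunk and regularize the distance between the trunks' parameters.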

Sebastian Ruder • 2017

Related benchmarks

Task | Dataset | Metric | Result | Rank
--- | --- | --- | --- | ---
Emotion Classification | Emotion Classification In-domain (test) | F1 Score | 52.14 | 128
Long-term Action Anticipation | Ego4D v1 (test) | ED@Z=20 Verb | 0.74 | 31
State change classification | Ego4D v1 (test) | Accuracy | 71.1 | 29
Action Recognition | Ego4D v1 (test) | Top-1 Accuracy (Verb) | 22.05 | 23
Emotion Classification | Emotion (Out-of-domain) | F1 Score | 0.3145 | 22
Point-of-no-return temporal localization | Ego4D v1 (test) | Error | 0.62 | 21
Graph Algorithmic Reasoning | CLRS (test) | BFS Accuracy | 0.986 | 14
Multi-objective Recommendation | Kuaishou (offline) | Consistency | 32.56 | 9
Multi-objective Recommendation | Alibaba-Youku (offline) | VV | 72.31 | 9
Multi-objective Recommendation | Yelp (offline) | Relevance | 0.6677 | 9
Showing 10 of 19 rows
