
AdaShare: Learning What To Share For Efficient Deep Multi-Task Learning

About

Multi-task learning is an open and challenging problem in computer vision. The typical way of conducting multi-task learning with deep neural networks is either through handcrafted schemes that share all initial layers and branch out at an ad hoc point, or through separate task-specific networks with an additional feature sharing/fusion mechanism. Unlike existing methods, we propose an adaptive sharing approach, called AdaShare, that decides what to share across which tasks to achieve the best recognition accuracy, while taking resource efficiency into account. Specifically, our main idea is to learn the sharing pattern through a task-specific policy that selectively chooses which layers to execute for a given task in the multi-task network. We efficiently optimize the task-specific policy jointly with the network weights, using standard back-propagation. Experiments on several challenging and diverse benchmark datasets with a variable number of tasks demonstrate the efficacy of our approach over state-of-the-art methods. Project page: https://cs-people.bu.edu/sunxm/AdaShare/project.html.
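The core idea above — a task-specific policy that decides, per shared layer, whether a task executes or skips that layer, trained jointly with the network by back-propagation — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the block count, task count, and the toy affine "blocks" are placeholders, and the Gumbel-Softmax relaxation (which AdaShare uses to make the discrete execute/skip choice differentiable) is shown in its standard form.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_BLOCKS = 4   # shared blocks in the multi-task backbone (placeholder)
NUM_TASKS = 2    # e.g. segmentation + depth (placeholder)

# Task-specific policy logits: policy_logits[t, l] holds the
# (execute, skip) scores for block l under task t. In AdaShare these
# are learned jointly with the network weights; here they are random.
policy_logits = rng.normal(size=(NUM_TASKS, NUM_BLOCKS, 2))

def gumbel_softmax(logits, tau=1.0):
    """Differentiable relaxation of a categorical sample (Gumbel-Softmax)."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel noise
    y = np.exp((logits + g) / tau)
    return y / y.sum(axis=-1, keepdims=True)

def forward(x, task, blocks, tau=1.0):
    """Run x through the shared blocks, executing or skipping each one
    according to the task's (relaxed) policy."""
    for l, block in enumerate(blocks):
        probs = gumbel_softmax(policy_logits[task, l], tau)
        # Soft mixture during training; at test time this becomes a hard
        # choice (execute block l iff probs[0] > probs[1]).
        x = probs[0] * block(x) + probs[1] * x
    return x

# Toy residual-style blocks standing in for real network layers.
weights = [rng.normal(size=(8, 8)) * 0.1 for _ in range(NUM_BLOCKS)]
blocks = [lambda x, W=W: x + x @ W for W in weights]

x = rng.normal(size=(8,))
out_task0 = forward(x, task=0, blocks=blocks)
out_task1 = forward(x, task=1, blocks=blocks)
```

Because the policy is sampled with the Gumbel-Softmax relaxation, the execute/skip decisions stay differentiable, so gradients flow to both the policy logits and the block weights through ordinary back-propagation.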

Ximeng Sun, Rameswar Panda, Rogerio Feris, Kate Saenko • 2019

Related benchmarks

| Task                       | Dataset               | Result                           | Rank |
|----------------------------|-----------------------|----------------------------------|------|
| Semantic Segmentation      | Cityscapes (test)     | mIoU: 41.5                       | 1145 |
| Depth Estimation           | NYU v2 (test)         | --                               | 423  |
| Image Classification       | CUB                   | Accuracy: 86.2                   | 249  |
| Semantic Segmentation      | NYU v2 (test)         | mIoU: 30.2                       | 248  |
| Surface Normal Estimation  | NYU v2 (test)         | Mean Angle Distance (MAD): 16.6  | 206  |
| Depth Estimation           | NYU Depth V2          | --                               | 177  |
| Surface Normal Prediction  | NYU V2                | Mean Error: 16.6                 | 100  |
| Semantic Segmentation      | NYU V2                | mIoU: 30.2                       | 74   |
| Multi-Task Adaptation      | PASCAL Context (test) | --                               | 70   |
| Monocular Depth Estimation | Cityscapes            | Accuracy (δ < 1.25): 71.4        | 62   |

(Showing 10 of 58 benchmark rows.)

Other info

Code
