MTI-Net: Multi-Scale Task Interaction Networks for Multi-Task Learning

About

In this paper, we argue for the importance of considering task interactions at multiple scales when distilling task information in a multi-task learning setup. In contrast to common belief, we show that tasks with high affinity at a certain scale are not guaranteed to retain this behaviour at other scales, and vice versa. We propose a novel architecture, namely MTI-Net, that builds upon this finding in three ways. First, it explicitly models task interactions at every scale via a multi-scale multi-modal distillation unit. Second, it propagates distilled task information from lower to higher scales via a feature propagation module. Third, it aggregates the refined task features from all scales via a feature aggregation unit to produce the final per-task predictions. Extensive experiments on two multi-task dense labeling datasets show that, unlike prior work, our multi-task model delivers on the full potential of multi-task learning, that is, a smaller memory footprint, a reduced number of calculations, and better performance w.r.t. single-task learning. The code is made publicly available: https://github.com/SimonVandenhende/Multi-Task-Learning-PyTorch.

Simon Vandenhende, Stamatios Georgoulis, Luc Van Gool • 2020
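The three components named in the abstract (per-scale multi-modal distillation, coarse-to-fine feature propagation, and cross-scale aggregation) can be sketched as a toy data-flow. This is a hypothetical, heavily simplified illustration, not the authors' implementation: features are plain floats, the scale list and mixing weights are made up, and each module is reduced to a simple average.

```python
# Hypothetical sketch of the MTI-Net data flow (not the authors' code):
# features are floats; distillation/propagation/aggregation are toy averages.

TASKS = ["segmentation", "depth"]
SCALES = [32, 16, 8, 4]  # strides, coarse to fine (illustrative choice)

def distill(task_feats):
    """Multi-modal distillation at one scale: refine each task's feature
    using information from the other tasks (here: mix with a shared mean)."""
    shared = sum(task_feats.values()) / len(task_feats)
    return {t: 0.5 * f + 0.5 * shared for t, f in task_feats.items()}

def propagate(coarse, fine):
    """Feature propagation: pass distilled information from the lower
    (coarser) scale up to the next higher (finer) scale."""
    return {t: fine[t] + 0.25 * coarse[t] for t in fine}

def aggregate(per_scale):
    """Feature aggregation: combine refined task features from all scales
    into one value per task (stand-in for the final per-task heads)."""
    return {t: sum(s[t] for s in per_scale) / len(per_scale) for t in TASKS}

def mti_net(backbone_feats):
    """backbone_feats: {scale: {task: float}} initial per-task features."""
    refined, prev = [], None
    for scale in SCALES:                      # coarse -> fine
        feats = distill(backbone_feats[scale])
        if prev is not None:                  # propagate from coarser scale
            feats = propagate(prev, feats)
        refined.append(feats)
        prev = feats
    return aggregate(refined)

feats = {s: {"segmentation": 1.0, "depth": 2.0} for s in SCALES}
preds = mti_net(feats)
```

In the real network each of these toy averages is a learned convolutional block operating on feature maps, but the overall wiring (distill at every scale, feed coarse results into finer scales, fuse everything at the end) is the point being illustrated.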

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Depth Estimation | NYU v2 (test) | -- | 432 |
| Semantic segmentation | PASCAL Context (val) | mIoU 61.7 | 360 |
| Semantic segmentation | NYU v2 (test) | mIoU 49 | 282 |
| Surface Normal Estimation | NYU v2 (test) | Mean Angle Distance (MAD) 20.27 | 224 |
| Semantic segmentation | Cityscapes | mIoU 59.85 | 218 |
| Depth Estimation | NYU Depth V2 | RMSE 0.529 | 209 |
| Semantic segmentation | NYUD v2 (test) | mIoU 45.97 | 187 |
| Semantic segmentation | NYU Depth V2 (test) | mIoU 49 | 183 |
| Semantic segmentation | NYUD v2 | mIoU 49 | 125 |
| Semantic segmentation | NYUDv2 40-class (test) | mIoU 49 | 99 |
Showing 10 of 38 rows

Other info

Code: https://github.com/SimonVandenhende/Multi-Task-Learning-PyTorch