
Learning Multiple Dense Prediction Tasks from Partially Annotated Data

About

Despite recent advances in multi-task learning of dense prediction problems, most methods rely on expensive labelled datasets. In this paper, we present a label-efficient approach and study joint learning of multiple dense prediction tasks on partially annotated data (i.e. not all task labels are available for each image), which we call multi-task partially-supervised learning. We propose a multi-task training procedure that successfully leverages task relations to supervise its multi-task learning when data is partially annotated. In particular, we learn to map each task pair to a joint pairwise task-space, which enables sharing information between tasks in a computationally efficient way through another network conditioned on task pairs, and avoids learning trivial cross-task relations by retaining high-level information about the input image. We rigorously demonstrate that our proposed method effectively exploits images with unlabelled tasks and outperforms existing semi-supervised learning approaches and related methods on three standard benchmarks.
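The core mechanism described above, projecting each task pair into a shared pairwise space via a network conditioned on the pair, can be illustrated with a minimal numpy sketch. All dimensions, the concatenation-based conditioning, and the consistency loss below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): per-task feature size,
# joint pairwise task-space size, task-embedding size.
FEAT_DIM, JOINT_DIM, EMB_DIM, N_TASKS = 16, 8, 4, 3

# One learned embedding per task; a pair (s, t) conditions the shared
# mapping network by concatenating both task embeddings with the features.
task_emb = rng.normal(size=(N_TASKS, EMB_DIM))
W = rng.normal(size=(FEAT_DIM + 2 * EMB_DIM, JOINT_DIM)) * 0.1

def map_to_joint_space(feat, s, t):
    """Project task-s features into the joint space for pair (s, t)."""
    cond = np.concatenate([feat, task_emb[s], task_emb[t]])
    return np.tanh(cond @ W)  # one shared network, conditioned on the pair

# Cross-task supervision on an image where only one task is labelled:
# both tasks' features are mapped into the same pairwise space and
# encouraged to agree there.
feat_seg = rng.normal(size=FEAT_DIM)  # e.g. segmentation branch features
feat_dep = rng.normal(size=FEAT_DIM)  # e.g. depth branch features
z_seg = map_to_joint_space(feat_seg, 0, 1)
z_dep = map_to_joint_space(feat_dep, 1, 0)
consistency_loss = np.mean((z_seg - z_dep) ** 2)
```

Conditioning a single shared network on the task pair, rather than training one mapping network per pair, is what keeps the number of parameters from growing quadratically with the number of tasks.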

Wei-Hong Li, Xialei Liu, Hakan Bilen · 2021

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Depth Estimation | NYU v2 (test) | -- | 432 |
| Semantic segmentation | NYU v2 (test) | mIoU 29.65 | 282 |
| Surface Normal Estimation | NYU v2 (test) | Mean Angle Distance (MAD) 31.06 | 224 |
| Depth Estimation | NYU Depth V2 | -- | 209 |
| Semantic segmentation | NYUD v2 (test) | mIoU 34.26 | 187 |
| Semantic segmentation | NYUD v2 | mIoU 41 | 125 |
| Surface Normal Prediction | NYU V2 | Mean Error 28.58 | 118 |
| Multi-Object Tracking | nuScenes (val) | AMOTA 25.5 | 33 |
| Online Mapping | nuScenes (val) | -- | 32 |
