
Attentive Single-Tasking of Multiple Tasks

About

In this work we address task interference in universal networks by training a network on multiple tasks while having it perform only one task at a time, an approach we refer to as "single-tasking multiple tasks". The network modifies its behaviour through task-dependent feature adaptation, or task attention, which gives it the ability to accentuate the features adapted to a task while shunning irrelevant ones. We further reduce task interference by forcing the task gradients to be statistically indistinguishable through adversarial training, ensuring that the common backbone architecture serving all tasks is not dominated by any single task-specific gradient. Results on three multi-task dense labelling problems consistently show: (i) a large reduction in the number of parameters while preserving, or even improving, performance and (ii) a smooth trade-off between computation and multi-task accuracy. We provide our system's code and pre-trained models at http://vision.ee.ethz.ch/~kmaninis/astmt/.
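The core idea of task-dependent feature adaptation can be illustrated with a minimal sketch: a shared backbone produces features, and each task applies its own learned per-channel gates before its output head, so the network "single-tasks" whichever task is requested. This is only an illustrative NumPy toy (the function names and gate parameterization below are assumptions, not the authors' implementation, which additionally uses residual adapters and adversarial gradient training):

```python
import numpy as np

def task_modulate(features, gates, task):
    """Scale shared backbone features with one task's channel gates.

    features: (C, H, W) shared activations from the common backbone
    gates:    dict mapping task name -> (C,) raw (pre-sigmoid) gate params
    task:     which task the network is currently single-tasking
    """
    g = 1.0 / (1.0 + np.exp(-gates[task]))  # sigmoid -> per-channel weight in (0, 1)
    return features * g[:, None, None]      # broadcast gates over spatial dims

rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 2, 2))      # toy shared feature map

# Hypothetical learned gates: one task keeps all channels, the other
# suppresses them, mimicking task attention on shared features.
gates = {"segmentation": np.full(4, 10.0),  # sigmoid(10) ~ 1: channels kept
         "depth": np.full(4, -10.0)}        # sigmoid(-10) ~ 0: channels shunned

seg_out = task_modulate(feats, gates, "segmentation")
depth_out = task_modulate(feats, gates, "depth")
```

In the full method the gates would be trained jointly with the backbone, one forward pass per task, so the same shared parameters serve every task while each task sees its own adapted feature distribution.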

Kevis-Kokitsi Maninis, Ilija Radosavovic, Iasonas Kokkinos • 2019

Related benchmarks

Task                            | Dataset                                                 | Result           | Rank
Semantic Segmentation           | PASCAL Context (val)                                    | mIoU 68          | 323
Depth Estimation                | NYU Depth V2                                            | --               | 177
Facial Attribute Classification | CelebA                                                  | --               | 163
Surface Normal Prediction       | NYU V2                                                  | Mean Error 32.22 | 100
Saliency Detection              | PASCAL Context (test)                                   | maxF 65.7        | 57
Surface Normal Estimation       | PASCAL Context (test)                                   | mErr 14.7        | 50
Boundary Detection              | PASCAL Context (test)                                   | ODS-F 72.4       | 34
Depth Estimation                | Cityscapes                                              | Abs. Err. 0.016  | 22
Human Part Parsing              | PASCAL Context (test)                                   | mIoU 61.1        | 20
Saliency Detection              | PASCAL-Context clean and adverse conditions v1 (unseen) | mIoU 61.3        | 19

Showing 10 of 22 rows
