
On Regularized Losses for Weakly-supervised CNN Segmentation

About

Minimization of regularized losses is a principled approach to weak supervision, well established in deep learning in general. However, it is largely overlooked in semantic segmentation, which is currently dominated by methods that mimic full supervision via "fake" fully-labeled training masks (proposals) generated from the available partial input. To obtain such full masks, the typical methods explicitly use standard regularization techniques for "shallow" segmentation, e.g., graph cuts or dense CRFs. In contrast, we integrate such standard regularizers directly into the loss functions over partial input. This approach simplifies weakly-supervised training by avoiding extra MRF/CRF inference steps or layers that explicitly generate full masks, while improving both the quality and efficiency of training. This paper proposes and experimentally compares different losses integrating MRF/CRF regularization terms. We juxtapose our regularized losses with earlier proposal-generation methods that use explicit regularization steps or layers. Our approach achieves state-of-the-art accuracy in semantic segmentation with near full-supervision quality.
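The core idea above is a loss with two parts: a partial cross-entropy over the few labeled pixels, plus a relaxed "shallow" regularizer (e.g., Potts/CRF) evaluated directly on the network's soft predictions. A minimal NumPy sketch of this structure, assuming a simple quadratic Potts relaxation on a 4-connected grid (function names are illustrative; the paper's actual losses also include dense, bilateral CRF kernels):

```python
import numpy as np

def partial_cross_entropy(probs, labels):
    # Cross-entropy summed only over labeled pixels (weak supervision).
    # probs:  (H, W, K) softmax outputs; labels: (H, W) int, -1 = unlabeled.
    mask = labels >= 0
    p = probs[mask, labels[mask]]          # predicted prob. of the true class
    return -np.sum(np.log(p + 1e-12))

def potts_relaxation(probs):
    # Quadratic relaxation of the Potts model on a 4-connected grid:
    # sum over neighboring pixels (p, q) of (1 - <probs_p, probs_q>),
    # which penalizes label disagreement between neighbors.
    h = np.sum(1.0 - np.sum(probs[:, :-1] * probs[:, 1:], axis=-1))
    v = np.sum(1.0 - np.sum(probs[:-1, :] * probs[1:, :], axis=-1))
    return h + v

def regularized_loss(probs, labels, lam=1.0):
    # Total loss = partial cross-entropy + lambda * regularizer,
    # trained by gradient descent with no proposal-generation step.
    return partial_cross_entropy(probs, labels) + lam * potts_relaxation(probs)
```

Because the regularizer is differentiable in the softmax outputs, it is minimized jointly with the supervised term by ordinary backpropagation, with no extra MRF/CRF inference pass to synthesize full masks.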

Meng Tang, Federico Perazzi, Abdelaziz Djelouah, Ismail Ben Ayed, Christopher Schroers, Yuri Boykov · 2018

Related benchmarks

Task                        Dataset                           Metric       Result   Rank
Semantic segmentation       ADE20K (val)                      mIoU         37.4     2731
Semantic segmentation       PASCAL VOC 2012 (val)             Mean IoU     76.8     2040
Semantic segmentation       PASCAL VOC 2012 (test)            mIoU         75       1342
Semantic segmentation       Cityscapes (val)                  mIoU         69.3     572
Semantic segmentation       Pascal VOC augmented 2012 (val)   mIoU         75       162
Semantic segmentation       Pascal VOC 21 classes (val)       mIoU         75.8     103
Medical image segmentation  ACDC (5-fold cross-validation)    Mean DSC     0.856    26
Semantic segmentation       Cityscapes (val)                  mIoU (10%)   57.4     6
Semantic segmentation       DensePose 2014 (minival)          mIoU         31.3     2
