
Adaptive Prototype Learning and Allocation for Few-Shot Segmentation

About

Prototype learning is extensively used for few-shot segmentation. Typically, a single prototype is obtained from the support feature by averaging the global object information. However, using one prototype to represent all the information may lead to ambiguities. In this paper, we propose two novel modules, named superpixel-guided clustering (SGC) and guided prototype allocation (GPA), for multiple prototype extraction and allocation. Specifically, SGC is a parameter-free and training-free approach, which extracts more representative prototypes by aggregating similar feature vectors, while GPA is able to select matched prototypes to provide more accurate guidance. By integrating the SGC and GPA together, we propose the Adaptive Superpixel-guided Network (ASGNet), which is a lightweight model and adapts to object scale and shape variation. In addition, our network can easily generalize to k-shot segmentation with substantial improvement and no additional computational cost. In particular, our evaluations on COCO demonstrate that ASGNet surpasses the state-of-the-art method by 5% in 5-shot segmentation.
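The abstract describes two ideas: extracting multiple prototypes from the masked support feature by aggregating similar feature vectors (SGC), and matching each query location to its best prototype to build guidance (GPA). The following is a minimal NumPy sketch of those two ideas only; all function names and parameters here are invented for illustration, and plain cosine k-means stands in for the paper's superpixel-guided clustering, which additionally uses spatial information.

```python
import numpy as np

def extract_prototypes(support_feat, support_mask, n_proto=5, n_iter=10, seed=0):
    """Cluster masked support features into multiple prototypes (SGC-style sketch).

    support_feat: (C, H, W) feature map; support_mask: (H, W) binary object mask.
    Returns (n_proto, C) L2-normalized prototype vectors.
    """
    C, H, W = support_feat.shape
    # Keep only foreground feature vectors, then cosine-normalize them.
    fg = support_feat.reshape(C, -1).T[support_mask.reshape(-1) > 0]   # (N, C)
    fg = fg / (np.linalg.norm(fg, axis=1, keepdims=True) + 1e-8)
    rng = np.random.default_rng(seed)
    protos = fg[rng.choice(len(fg), size=n_proto, replace=False)].copy()
    for _ in range(n_iter):
        # Assign each foreground vector to its most similar prototype.
        assign = (fg @ protos.T).argmax(axis=1)                        # (N,)
        for k in range(n_proto):
            members = fg[assign == k]
            if len(members):
                protos[k] = members.mean(axis=0)
        protos = protos / (np.linalg.norm(protos, axis=1, keepdims=True) + 1e-8)
    return protos

def allocate_prototypes(query_feat, protos):
    """Select the best-matching prototype per query location (GPA-style sketch).

    query_feat: (C, H, W); protos: (K, C).
    Returns a (C, H, W) guidance map holding, at every spatial position,
    the prototype with the highest cosine similarity to that query vector.
    """
    C, H, W = query_feat.shape
    q = query_feat.reshape(C, -1).T                                    # (H*W, C)
    q = q / (np.linalg.norm(q, axis=1, keepdims=True) + 1e-8)
    best = (q @ protos.T).argmax(axis=1)                               # (H*W,)
    return protos[best].T.reshape(C, H, W)
```

In this sketch the number of prototypes is fixed, whereas the adaptive behaviour described in the abstract comes from letting the prototype count follow object scale; the guidance map would then be concatenated with the query feature for the segmentation head.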

Gen Li, Varun Jampani, Laura Sevilla-Lara, Deqing Sun, Jonghyun Kim, Joongkyu Kim • 2021

Related benchmarks

Task                            | Dataset                  | Result               | Rank
--------------------------------|--------------------------|----------------------|-----
Few-shot Segmentation           | PASCAL-5i                | mIoU (Fold 0): 64.6  | 325
Few-shot Semantic Segmentation  | PASCAL-5^i (test)        | FB-IoU: 75.2         | 177
Few-shot Segmentation           | COCO 20^i (test)         | mIoU: 42.48          | 174
Semantic segmentation           | COCO-20i                 | mIoU (Mean): 42.5    | 132
Few-shot Semantic Segmentation  | COCO-20i                 | mIoU: 42.5           | 115
Few-shot Semantic Segmentation  | PASCAL-5i                | mIoU: 64.4           | 96
Few-shot Segmentation           | PASCAL 5i (val)          | mIoU (Mean): 59.31   | 83
Few-shot Semantic Segmentation  | COCO-20i (test)          | --                   | 79
Few-shot Semantic Segmentation  | COCO-20i (val)           | mIoU (Mean): 42.48   | 78
Few-shot Segmentation           | Pascal-5^i 1-way 1-shot  | mIoU: 59.3           | 71

(Showing 10 of 24 rows)

Other info

Code
