
UP-DETR: Unsupervised Pre-training for Object Detection with Transformers

About

DEtection TRansformer (DETR) achieves object detection performance competitive with Faster R-CNN via a transformer encoder-decoder architecture. However, with transformers trained from scratch, DETR needs large-scale training data and an extremely long training schedule, even on the COCO dataset. Inspired by the great success of pre-training transformers in natural language processing, we propose a novel pretext task named random query patch detection in Unsupervised Pre-training DETR (UP-DETR). Specifically, we randomly crop patches from the given image and then feed them as queries to the decoder. The model is pre-trained to detect these query patches from the input image. During the pre-training, we address two critical issues: multi-task learning and multi-query localization. (1) To trade off classification and localization preferences in the pretext task, we find that freezing the CNN backbone is the prerequisite for the success of pre-training transformers. (2) To perform multi-query localization, we develop multi-query patch detection with an attention mask. In addition, UP-DETR provides a unified perspective for fine-tuning object detection and one-shot detection tasks. In our experiments, UP-DETR significantly boosts the performance of DETR with faster convergence and higher average precision on object detection, one-shot detection and panoptic segmentation. Code and pre-training models: https://github.com/dddzg/up-detr.
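The pretext task described above can be sketched in a few lines. The snippet below is a minimal, dependency-light illustration (numpy only, not the authors' PyTorch implementation): it samples random crops from an image and returns them together with DETR-style normalized boxes, which serve as the self-supervised localization targets for the decoder queries. Function names and crop-size limits are illustrative assumptions, not part of the released code.

```python
import numpy as np

def random_query_patches(image, num_queries=3, rng=None):
    """Sample random crops from `image` (H, W, C) and return the patches
    together with their normalized target boxes in (cx, cy, w, h) format.
    In UP-DETR, each patch (after a CNN encoder and global pooling) is fed
    as a query to the DETR decoder, which is pre-trained to localize it.
    Crop-size bounds here (1/8 to 1/2 of the image side) are arbitrary."""
    if rng is None:
        rng = np.random.default_rng()
    H, W = image.shape[:2]
    patches, boxes = [], []
    for _ in range(num_queries):
        # Sample a crop size, then a valid top-left corner for it.
        w = int(rng.integers(W // 8, W // 2 + 1))
        h = int(rng.integers(H // 8, H // 2 + 1))
        x = int(rng.integers(0, W - w + 1))
        y = int(rng.integers(0, H - h + 1))
        patches.append(image[y:y + h, x:x + w])
        # DETR-style box: normalized center x, center y, width, height.
        boxes.append([(x + w / 2) / W, (y + h / 2) / H, w / W, h / H])
    return patches, np.array(boxes)
```

During pre-training, the decoder's predicted boxes are matched against these sampled boxes, so no human annotations are required.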

Zhigang Dai, Bolun Cai, Yugeng Lin, Junying Chen • 2020

Related benchmarks

Task | Dataset | Result | Rank
Object Detection | COCO 2017 (val) | AP 42.8 | 2454
Object Detection | MS-COCO (val) | mAP 0.428 | 138
Object Detection | PASCAL VOC 2007 (test) | AP 57.2 | 18
Class-agnostic Object Detection | MS-COCO 2017 (val) | AP (Overall) 0.001 | 15
Object Detection | MS-COCO In-Domain (val) | D-ECE 25.5 | 6
Object Detection | CorCOCO Out-Domain (val) | D-ECE 27.5 | 6
