
Rethinking Transformer-based Set Prediction for Object Detection

About

DETR is a recently proposed Transformer-based method that views object detection as a set prediction problem and achieves state-of-the-art performance, but demands extra-long training time to converge. In this paper, we investigate the causes of the optimization difficulty in the training of DETR. Our examination reveals several factors contributing to DETR's slow convergence, primarily issues with the Hungarian loss and the Transformer cross-attention mechanism. To overcome these issues, we propose two solutions, namely, TSP-FCOS (Transformer-based Set Prediction with FCOS) and TSP-RCNN (Transformer-based Set Prediction with RCNN). Experimental results show that the proposed methods not only converge much faster than the original DETR, but also significantly outperform DETR and other baselines in terms of detection accuracy.
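The Hungarian loss mentioned above is built on a minimum-cost one-to-one matching between the model's predictions and the ground-truth objects. The sketch below illustrates that matching step only, using a brute-force search over permutations for clarity (real implementations use the Hungarian algorithm, e.g. `scipy.optimize.linear_sum_assignment`); the cost values and the `best_matching` helper are illustrative assumptions, not from the paper, and the actual DETR cost combines classification and box-regression terms.

```python
from itertools import permutations

def best_matching(cost):
    """Brute-force minimum-cost one-to-one matching of ground truths to
    predictions -- the role the Hungarian algorithm plays in set prediction.

    cost[i][j] is the (toy) cost of assigning prediction i to ground truth j.
    Returns (total_cost, [(pred_index, gt_index), ...]).
    """
    n_pred, n_gt = len(cost), len(cost[0])
    best = None
    # Try every injective assignment of ground truths to predictions.
    for perm in permutations(range(n_pred), n_gt):
        total = sum(cost[p][g] for g, p in enumerate(perm))
        if best is None or total < best[0]:
            best = (total, [(p, g) for g, p in enumerate(perm)])
    return best

# 3 predictions x 2 ground-truth objects; unmatched predictions get "no object".
cost = [[0.9, 0.1],
        [0.2, 0.8],
        [0.7, 0.6]]
total, pairs = best_matching(cost)
print(total, pairs)
```

Here prediction 1 is matched to ground truth 0 and prediction 0 to ground truth 1 (total cost 0.3), while prediction 2 is left unmatched; in DETR the loss is then computed against this assignment, which is one source of the optimization difficulty the paper examines.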

Zhiqing Sun, Shengcao Cao, Yiming Yang, Kris Kitani • 2020

Related benchmarks

Task              Dataset                Result     Rank
Object Detection  COCO 2017 (val)        AP 46.5    2454
Object Detection  COCO (test-dev)        mAP 47.4   1195
Object Detection  COCO (val)             mAP 45     613
Object Detection  COCO v2017 (test-dev)  mAP 46.6   499
Object Detection  MS-COCO 2017 (val)     --         237
Object Detection  MS-COCO (val)          mAP 0.45   138
