
SATr: Slice Attention with Transformer for Universal Lesion Detection

About

Universal Lesion Detection (ULD) in computed tomography plays an essential role in computer-aided diagnosis. Promising ULD results have been reported by multi-slice-input detection approaches, which model 3D context from multiple adjacent CT slices, but such methods still struggle to obtain a global representation across different slices and within each individual slice because they rely only on convolution-based fusion operations. In this paper, we propose a novel Slice Attention Transformer (SATr) block that can be easily plugged into convolution-based ULD backbones to form hybrid network structures. These hybrid backbones better model long-distance feature dependency via the cascaded self-attention modules in the Transformer block while retaining the strong local-feature modeling of the convolutional operations in the original backbone. Experiments with five state-of-the-art methods show that the proposed SATr block provides an almost free boost to lesion detection accuracy without extra hyperparameters or special network designs.

Han Li, Long Chen, Hu Han, S. Kevin Zhou • 2022
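The core idea of the abstract — self-attention computed across adjacent CT slices so each slice's features can draw on global inter-slice context — can be illustrated with a minimal sketch. This is not the authors' exact SATr block; the function name, projection matrices, and shapes below are illustrative assumptions, showing only generic scaled dot-product attention over the slice dimension:

```python
import numpy as np

def slice_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention across CT slices (illustrative sketch).

    x        : (S, D) array, one D-dim feature vector per slice.
    wq/wk/wv : (D, D) projection matrices (learned in a real network).
    Returns an (S, D) array in which each slice's feature is a weighted
    mix of all slices, modeling long-distance inter-slice dependency.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(x.shape[1])        # (S, S) slice-to-slice affinities
    scores -= scores.max(axis=-1, keepdims=True)  # for numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)      # softmax over slices
    return attn @ v

rng = np.random.default_rng(0)
S, D = 7, 16                                      # e.g. 7 adjacent slices
x = rng.standard_normal((S, D))
wq, wk, wv = (rng.standard_normal((D, D)) / np.sqrt(D) for _ in range(3))
out = slice_attention(x, wq, wk, wv)
print(out.shape)                                  # (7, 16)
```

In the hybrid design described above, a block like this would be inserted alongside the convolutional stages, so local features come from convolutions while the attention output supplies the global, cross-slice representation.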

Related benchmarks

Task                       | Dataset                                      | Metric                 | Result | Rank
Universal Lesion Detection | DeepLesion universal (test)                  | Sensitivity @ 0.5 FPPI | 75.24  | 34
Universal Lesion Detection | DeepLesion official (test)                   | Sensitivity @ 0.5 FPPI | 81.03  | 20
Universal Lesion Detection | DeepLesion standard (test)                   | Sensitivity @ 0.5 FPPI | 81.02  | 13
Lesion Detection           | DeepLesion Lesion-Harvester augmented (test) | Sensitivity @ 0.5 FPPI | 91.04  | 3
