
Prompt-Free SAM-Based Multi-Task Framework for Breast Ultrasound Lesion Segmentation and Classification

About

Accurate tumor segmentation and classification in breast ultrasound (BUS) imaging remain challenging due to low contrast, speckle noise, and diverse lesion morphology. This study presents a multi-task deep learning framework that jointly performs lesion segmentation and diagnostic classification using embeddings from the Segment Anything Model (SAM) vision encoder. Unlike prompt-based SAM variants, our approach employs a prompt-free, fully supervised adaptation where high-dimensional SAM features are decoded through either a lightweight convolutional head or a UNet-inspired decoder for pixel-wise segmentation. The classification branch is enhanced via mask-guided attention, allowing the model to focus on lesion-relevant features while suppressing background artifacts. Experiments on the PRECISE 2025 breast ultrasound dataset, split per class into 80 percent training and 20 percent testing, show that the proposed method achieves a Dice Similarity Coefficient (DSC) of 0.887 and an accuracy of 92.3 percent, ranking among the top entries on the PRECISE challenge leaderboard. These results demonstrate that SAM-based representations, when coupled with segmentation-guided learning, significantly improve both lesion delineation and diagnostic prediction in breast ultrasound imaging.
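The mask-guided attention described above can be illustrated with a minimal sketch: the predicted lesion mask is normalized into spatial attention weights, which pool the encoder feature map into a lesion-focused descriptor for the classification branch. Function names, shapes, and the normalization constant below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def mask_guided_pool(features, mask, eps=1e-6):
    """Pool a feature map under mask-guided attention (hypothetical sketch).

    features: (C, H, W) feature map, e.g. decoded from SAM encoder embeddings
    mask:     (H, W) predicted lesion probability map in [0, 1]
    Returns a (C,) descriptor emphasizing lesion regions and
    suppressing background, as the classification branch requires.
    """
    weights = mask / (mask.sum() + eps)          # normalize mask to attention weights
    return (features * weights).sum(axis=(1, 2))  # weighted spatial average per channel

# Toy usage: a 3-channel feature map with one "lesion" pixel
feats = np.full((3, 4, 4), 2.0)
lesion_mask = np.zeros((4, 4))
lesion_mask[1, 2] = 1.0
descriptor = mask_guided_pool(feats, lesion_mask)  # close to [2.0, 2.0, 2.0]
```

In a full model this pooled descriptor would feed a small classification head, so gradients from the diagnostic loss also flow back through the segmentation branch, coupling the two tasks.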

Samuel E. Johnny, Bernes L. Atabonfack, Israel Alagbe, Assane Gueye • 2026

Related benchmarks

Task                          Dataset                                        Result         Rank
Breast Cancer Classification  PRECISE breast ultrasound 2025 (test)          Accuracy 90.7  4
Breast Cancer Segmentation    PRECISE breast ultrasound dataset 2025 (test)  DSC 88.7       4
