
SegMamba: Long-range Sequential Modeling Mamba For 3D Medical Image Segmentation

About

The Transformer architecture has shown a remarkable ability to model global relationships. However, it poses a significant computational challenge when processing high-dimensional medical images, which hinders its development and widespread adoption for this task. Mamba, a State Space Model (SSM), recently emerged as a notable approach to modeling long-range dependencies in sequential data, excelling in the natural language processing field with its remarkable memory efficiency and computational speed. Inspired by its success, we introduce SegMamba, a novel 3D medical image **Seg**mentation **Mamba** model, designed to effectively capture long-range dependencies within whole-volume features at every scale. In contrast to Transformer-based methods, SegMamba excels at whole-volume feature modeling from a state space model standpoint while maintaining superior processing speed, even with volume features at a resolution of 64×64×64. Comprehensive experiments on the BraTS2023 dataset demonstrate the effectiveness and efficiency of SegMamba. The code for SegMamba is available at: https://github.com/ge-xing/SegMamba
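The core idea — flattening whole-volume 3D features into one long sequence and processing it with a linear-time state space recurrence — can be sketched with a toy, per-channel diagonal SSM in NumPy. This is an illustrative sketch only, not the authors' implementation; the function names and scalar parameters `a`, `b`, `c` are hypothetical stand-ins for learned SSM parameters.

```python
import numpy as np

def flatten_volume(feat):
    # feat: (C, D, H, W) volume features -> sequence of shape (L, C), L = D*H*W
    C = feat.shape[0]
    return feat.reshape(C, -1).T

def toy_ssm_scan(x, a, b, c):
    # Toy diagonal linear SSM: h_t = a * h_{t-1} + b * x_t ;  y_t = c * h_t
    # x: (L, C) flattened sequence; a, b, c: (C,) per-channel parameters.
    # Runtime is O(L) in sequence length, versus O(L^2) for full attention.
    L, C = x.shape
    h = np.zeros(C)
    y = np.empty_like(x)
    for t in range(L):
        h = a * h + b * x[t]
        y[t] = c * h
    return y

feat = np.random.rand(4, 8, 8, 8)            # (C=4) features on an 8x8x8 volume
seq = flatten_volume(feat)                   # (512, 4): 8*8*8 voxels as a sequence
out = toy_ssm_scan(seq,
                   a=np.full(4, 0.9),
                   b=np.ones(4),
                   c=np.ones(4))             # (512, 4)
print(seq.shape, out.shape)                  # (512, 4) (512, 4)
```

The linear-in-sequence-length scan is what makes modeling a full 64×64×64 feature volume (a sequence of 262,144 tokens) tractable, where quadratic self-attention would be prohibitively expensive.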

Zhaohu Xing, Tian Ye, Yijun Yang, Guang Liu, Lei Zhu• 2024

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Brain Tumor Segmentation | BraTS 2023 (test) | WT Dice: 93.61 | 49 |
| Brain Tumor Segmentation | BraTS 2020 | DSC (WT): 90.77 | 27 |
| Multi-organ Segmentation | BTCV | Spl Score: 92.32 | 22 |
| Medical Lesion Segmentation | Breast Lesion | Dice: 82.7 | 21 |
| Medical Lesion Segmentation | Lung Infection | Dice Score: 68 | 21 |
| Neuron Segmentation | CREMI-A (test) | VIs: 0.565 | 20 |
| Neuron Segmentation | CREMI-B (test) | VIs: 1.126 | 20 |
| Neuron Segmentation | CREMI-C (test) | VIs: 1.03 | 20 |
| Neuron Segmentation | AC3/AC4 (test) | VIs: 0.801 | 20 |
| CBCT Segmentation | Internal dataset for CBCT segmentation | DSC: 71.73 | 18 |

Showing 10 of 29 rows.

Code: https://github.com/ge-xing/SegMamba