# AISCT-SAM

The data and code for the paper "AISCT-SAM: A Clinical Knowledge-Driven Fine-Tuning Strategy for Applying Foundation Model to Fully Automatic Acute Ischemic Stroke Lesion Segmentation on Non-Contrast CT Scans" submitted to IEEE ICASSP 2025.

## Requirements

- CUDA 11.7
- Python 3.10.13
- PyTorch 2.0.0
- Torchvision 0.15.0
- batchgenerators 0.25
- SimpleITK 2.3.0
- SciPy 1.11.3

## Usage

### 0. Installation

Install our modified nnUNet as follows:

```shell
git clone https://github.com/GitHub-TXZ/AISCT-SAM.git
cd AISCT-SAM
pip install -e .
```

## 1. Acute Ischemic Stroke Dataset (AISD)

### 1.1 Dataset access

The AISD dataset can be downloaded from https://github.com/griffinliang/aisd.

### 1.2 Skull-stripping

After converting the DICOM files of the AISD dataset to NIfTI format, perform skull stripping following the instructions at https://github.com/WuChanada/StripSkullCT.

### 1.3 Flip-Registration

Then perform flip registration using the scripts in ./myscripts/Registration. Finally, organize the dataset into the nnUNet-expected format using the code in nnUNet/nnunet/dataset_conversion.
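For orientation, here is a minimal sketch of the raw-dataset layout that nnU-Net v2 conversion scripts typically produce (imagesTr/labelsTr folders plus a dataset.json). The channel name, label names, and dataset name are illustrative assumptions, not the repository's exact values:

```python
# Hedged sketch of an nnU-Net v2 raw-dataset layout.
# Channel/label names and the dataset name are assumptions for illustration only.
import json
import os


def make_nnunet_raw_layout(root: str, dataset_name: str, case_ids: list) -> str:
    """Create the imagesTr/labelsTr folders and a minimal dataset.json."""
    base = os.path.join(root, "nnUNet_raw", dataset_name)
    os.makedirs(os.path.join(base, "imagesTr"), exist_ok=True)
    os.makedirs(os.path.join(base, "labelsTr"), exist_ok=True)
    # Convention: imagesTr/<case>_0000.nii.gz holds channel 0,
    # labelsTr/<case>.nii.gz holds the segmentation mask.
    meta = {
        "channel_names": {"0": "NCCT"},            # assumed single non-contrast CT channel
        "labels": {"background": 0, "lesion": 1},  # assumed binary lesion labels
        "numTraining": len(case_ids),
        "file_ending": ".nii.gz",
    }
    with open(os.path.join(base, "dataset.json"), "w") as f:
        json.dump(meta, f, indent=2)
    return base


# Example (placeholder names):
# make_nnunet_raw_layout("/data", "Dataset001_AISD", ["case_0001", "case_0002"])
```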

### 1.4 Pre-processing

Some of the compared methods use the same pre-processing steps as nnUNet. Documentation of the pre-processing can be found at [DOC].

### 1.5 Training

Activate your conda environment, then run the following command:

```shell
CUDA_VISIBLE_DEVICES=0 nnUNetv2_train -dataset_name_or_id TASK_ID -model_name AIS_SAM -ex_name Ex1@b_2_p_20_256_256_s_3.0_0.4375_0.4375
```
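Upstream nnU-Net v2 locates its data through environment variables set before training; if this modified fork follows the same convention (an assumption, not confirmed by this README), the setup would look like:

```shell
# Assumed upstream nnU-Net v2 convention; adjust the paths to your own setup.
export nnUNet_raw=/data/nnUNet_raw
export nnUNet_preprocessed=/data/nnUNet_preprocessed
export nnUNet_results=/data/nnUNet_results
```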

### 1.6 Testing

Run the same command with the `--val` flag:

```shell
CUDA_VISIBLE_DEVICES=0 nnUNetv2_train -dataset_name_or_id TASK_ID -model_name AIS_SAM -ex_name Ex1@b_2_p_20_256_256_s_3.0_0.4375_0.4375 --val
```

### 2.1 Pre-trained model

The pre-trained model for the AISD dataset can be downloaded from [Baidu YUN] with the password "puyl".

### 2.2 Reproduction details and codes

During reproduction, we integrated the CNN-based, Transformer-based, hybrid CNN-Transformer-based, and Mamba-based methods into the nnUNet framework. All of these 3D methods can be found at [DOC].

For the AIS-specific methods and SAM-based methods, we endeavored to implement them on our AIS datasets using our reproduced codes. The links to their open-source codes are listed below:

- [Kuang et al.]
- [UNet-RF]
- [ADN]
- [SAM-Med2D]
- [SAM]
- [SAM-Med3D]
- [MedSAM]
- [MSA]
- [3DSAM Adapter]
- [SAMed]

Note that, to ensure fair comparisons, all compared methods used the same data split, and all metrics were computed at the 3D image level.
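As a concrete illustration of 3D-level evaluation, here is a minimal volume-wise Dice sketch (the exact metric code used in the paper is not shown in this README):

```python
# Hedged sketch: Dice computed over the whole 3D volume, not slice-by-slice.
import numpy as np


def dice_3d(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-8) -> float:
    """Dice similarity coefficient of two binary 3D masks of the same shape."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    # eps guards against division by zero when both masks are empty.
    return float(2.0 * intersection / (pred.sum() + gt.sum() + eps))


# A perfect prediction yields a Dice score of ~1.0; disjoint masks yield 0.0.
```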

## Acknowledgements

Part of the code is reused from nnU-Net; we thank Fabian Isensee for the awesome nnU-Net codebase. We also express our sincere gratitude to the authors of all the open-source code we have used in our work.