
[arXiv] The official code for "Only Positive Cases: 5-fold High-order Attention Interaction Model for Skin Segmentation Derived Classification".

Only Positive Cases: 5-fold High-order Attention Interaction Model for Skin Segmentation Derived Classification

Renkai Wu, Yinghao Liu, Pengchen Liang*, and Qing Chang*

arXiv

News 🚀

(2024.03.02) Add video to project page

(2023.12.06) Updated Prepare_ISIC2017.py to include reminders. Please prepare the data according to the reminders, otherwise errors may occur during data preparation ✅

(2023.12.01) The process and code for processing the negative samples used to test the classification ability of the model are now online 🔥🔥

(2023.11.28) The arXiv version of the paper is publicly available 📃📃

(2023.11.28) You can download the weight files of MHA-UNet here: Google Drive / Baidu Drive (btsd). 🔥

(2023.11.26) The project code has been uploaded 🔥

(2023.11.25) The first version of our paper has been uploaded to arXiv 📃

https://github.com/wurenkai/MHA-UNet/assets/124028634/921049af-4797-4db6-b6bf-b727df82ba56

0. Main Environments.

  • python 3.8
  • pytorch 1.12.0
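
A minimal environment setup could look like the following; the environment name and the torchvision version are assumptions, and any additional dependencies should be installed from the repository's requirements:

conda create -n mha-unet python=3.8
conda activate mha-unet
pip install torch==1.12.0 torchvision==0.13.0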

1. Prepare the Dataset and Pretrained weights.

A. Dataset
1- Download the ISIC 2017 train dataset from this link and extract both the training dataset and ground truth folders inside /data/dataset_isic17/.
2- Run Prepare_ISIC2017.py for data preparation and to split the data into train, validation and test sets.

Notice: For training and evaluating on ISIC 2018, pH2, NormalSkin and Kaggle95, follow the steps below:
1- Download the ISIC 2018 train dataset from this link and extract both the training dataset and ground truth folders inside /data/dataset_isic18/, then run Prepare_ISIC2018.py for data preparation and to split the data into train, validation and test sets.
2- Download the pH2 dataset from this link and extract it, then run Prepare_PH2_test.py for data preparation and to split the data into train, validation and test sets.
3- Download the NormalSkin dataset from this link.
4- Download the Kaggle95 dataset from this link.
5- The NormalSkin and Kaggle95 datasets are used as negative samples to test the classification ability of the model. These negative test samples can be prepared as follows:

#0 The negative datasets have no segmentation labels. All-black labels, one per original image, can be generated in the following way:

python generate_black.py

Note that the number of images to generate needs to be adjusted in generate_black.py to match your dataset.
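
For reference, a minimal sketch of the idea behind generate_black.py is shown below: one all-black (all-zero) mask is written per negative image. The folder paths and file naming are hypothetical placeholders, not the script's actual settings.

import os
import numpy as np
from PIL import Image

# Hypothetical locations of the negative images and the labels to generate.
image_dir = './data/negative/images'
label_dir = './data/negative/labels'
os.makedirs(label_dir, exist_ok=True)

for name in sorted(os.listdir(image_dir)):
    img = Image.open(os.path.join(image_dir, name))
    # All-black mask with the same height and width as the original image.
    black = np.zeros((img.height, img.width), dtype=np.uint8)
    Image.fromarray(black).save(os.path.join(label_dir, os.path.splitext(name)[0] + '.png'))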

#1 Once the same number of all-black labels is available, execute the following command to generate the test data:

python Prepare_Neg_test.py

Note that the number of images also needs to be adjusted in the Prepare_Neg_test.py file.
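
Conceptually, Prepare_Neg_test.py pairs each negative image with its all-black label and packages them in the same format as the positive test data. The sketch below illustrates the idea only; the paths, image size, and output file names are assumptions, and the script in this repository remains the authoritative version.

import os
import numpy as np
from PIL import Image

image_dir = './data/negative/images'   # hypothetical paths, see the generate_black sketch above
label_dir = './data/negative/labels'
size = (256, 256)                      # assumed test resolution

images, masks = [], []
for name in sorted(os.listdir(image_dir)):
    base = os.path.splitext(name)[0]
    images.append(np.array(Image.open(os.path.join(image_dir, name)).convert('RGB').resize(size)))
    masks.append(np.array(Image.open(os.path.join(label_dir, base + '.png')).convert('L').resize(size)))

# Pack the negative test set as numpy arrays (the output format is an assumption).
np.save('data_test.npy', np.array(images))
np.save('mask_test.npy', np.array(masks))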

B. Pretrained weights
The pretrained weights (.pth) file can be obtained from Google Drive or Baidu Drive (btsd).
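
After downloading, the checkpoint can be inspected and loaded with standard PyTorch calls. The snippet below is a generic sketch: the file name best.pth is a placeholder for the downloaded weights, and the network must be instantiated from the model code in this repository.

import torch

# Load the downloaded .pth file on CPU and look at what it contains.
checkpoint = torch.load('best.pth', map_location='cpu')
if isinstance(checkpoint, dict):
    print(list(checkpoint.keys())[:10])

# If the file is a plain state_dict, it can be loaded directly into the model:
# model = ...  # instantiate MHA-UNet from this repository's model definition
# model.load_state_dict(checkpoint)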

2. Train the MHA-UNet.

python train.py
  • After training, you can find the outputs in './results/'

3. Test the MHA-UNet. First, in the test.py file, set the checkpoint path in 'resume_model' and fill in the location of the test data in 'data_path'.

python test.py
  • After testing, you can find the outputs in './results/'

4. Get the MHA-UNet explainability result maps and EICA calculations. First, in the test_Explainable.py file, set the checkpoint path in 'resume_model' and fill in the location of the test data in 'data_path'.

python test_Explainable.py
  • After testing, you can find the outputs in './results/'. EICA is calculated for each case, and the EICA threshold defaults to 225. The final line 'Detected as true(number):' reports the total number of cases detected as positive.
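
The classification step described above reduces to a simple thresholding rule over the per-case EICA values. The sketch below assumes that a case whose EICA exceeds the 225 threshold is counted as positive; the exact comparison and score range are defined in test_Explainable.py, and the example scores are hypothetical.

# Minimal sketch of the per-case decision, assuming EICA above the threshold means positive.
EICA_THRESHOLD = 225  # default threshold

def count_detected_as_true(eica_scores, threshold=EICA_THRESHOLD):
    """Count how many cases are detected as positive."""
    return sum(1 for score in eica_scores if score > threshold)

# Hypothetical per-case EICA values for illustration:
scores = [310.4, 120.7, 254.9]
print('Detected as true(number):', count_detected_as_true(scores))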

Citation

If you find this repository helpful, please consider citing:

@article{wu2023only,
  title={Only Positive Cases: 5-fold High-order Attention Interaction Model for Skin Segmentation Derived Classification},
  author={Wu, Renkai and Liu, Yinghao and Liang, Pengchen and Chang, Qing},
  journal={arXiv preprint arXiv:2311.15625},
  year={2023}
}

Acknowledgement

This repo benefits from the awesome works of HorNet and MHorUNet.