
Fine-tuning nnUNet v1 on Task082_BraTS2020

noehsueh opened this issue 10 months ago · 7 comments

Hi,

I'm trying to fine-tune nnUNet for a brain tumor segmentation task using pretrained weights from Task082_BraTS2020. My data consists of brain MRI images with 4 input modalities, and the labels match those used in BraTS. I have followed the instructions in issue #774, but my results are unsatisfactory, and I suspect there may be errors in my data preprocessing. After exporting the environment variables, replacing the BraTS data in nnUNet_raw_data_base/nnUNet_raw_data/Task082_BraTS2020 with my custom fine-tuning data, and downloading the pretrained model, I ran:

nnUNet_plan_and_preprocess -t 082 --verify_dataset_integrity
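(For reference, a sketch of the v1 raw data layout this command expects for a 4-modality task; CASE is a placeholder case identifier, and the modality order and labels must match the dataset.json of Task082:)

nnUNet_raw_data_base/nnUNet_raw_data/Task082_BraTS2020/
    dataset.json            (modality and label definitions, must match the BraTS setup)
    imagesTr/
        CASE_0000.nii.gz    (modality 0)
        CASE_0001.nii.gz    (modality 1)
        CASE_0002.nii.gz    (modality 2)
        CASE_0003.nii.gz    (modality 3)
    labelsTr/
        CASE.nii.gz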

Then, for all the folds (0, 1, 2, 3, 4), I ran the following command:

nnUNet_train 3d_fullres nnUNetTrainerV2_250epochs 82 0 --npz -pretrained_weights '../model/nnUNet/3d_fullres/Task082_BraTS2020/nnUNetTrainerV2__nnUNetPlansv2.1/fold_0/model_final_checkpoint.model'
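(Equivalently, as a small shell loop; note that this reuses the fold_0 checkpoint as initialization for every fold, exactly as in the command above:)

for FOLD in 0 1 2 3 4; do
    nnUNet_train 3d_fullres nnUNetTrainerV2_250epochs 82 $FOLD --npz -pretrained_weights '../model/nnUNet/3d_fullres/Task082_BraTS2020/nnUNetTrainerV2__nnUNetPlansv2.1/fold_0/model_final_checkpoint.model'
done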

Here, nnUNetTrainerV2_250epochs is a custom trainer I created to fine-tune the model.
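(A minimal sketch of what such a trainer typically looks like in v1, assuming the usual pattern of subclassing nnUNetTrainerV2 and placing the file under nnunet/training/network_training/ so it can be found by name:)

from nnunet.training.network_training.nnUNetTrainerV2 import nnUNetTrainerV2

class nnUNetTrainerV2_250epochs(nnUNetTrainerV2):
    def __init__(self, plans_file, fold, output_folder=None, dataset_directory=None,
                 batch_dice=True, stage=None, unpack_data=True, deterministic=True, fp16=False):
        super().__init__(plans_file, fold, output_folder, dataset_directory,
                         batch_dice, stage, unpack_data, deterministic, fp16)
        self.max_num_epochs = 250  # shorten the default 1000-epoch schedule for fine-tuning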

I haven't overwritten the plans because I thought the command would just preprocess the files using the Task082 plans. Should I have used the -overwrite_plans flag instead? If so, what should the -overwrite_plans_identifier be?
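(For context, my reading of the v1 help text for these flags: -overwrite_plans takes the plans.pkl shipped with the pretrained model, and -overwrite_plans_identifier is a free-form tag that is then passed to training via -p. MY_ID is a placeholder and the paths are taken from the post above; please double-check against your nnUNet version:)

nnUNet_plan_and_preprocess -t 082 \
    -overwrite_plans ../model/nnUNet/3d_fullres/Task082_BraTS2020/nnUNetTrainerV2__nnUNetPlansv2.1/plans.pkl \
    -overwrite_plans_identifier MY_ID

nnUNet_train 3d_fullres nnUNetTrainerV2_250epochs 82 0 --npz \
    -p nnUNetPlans_pretrained_MY_ID \
    -pretrained_weights '../model/nnUNet/3d_fullres/Task082_BraTS2020/nnUNetTrainerV2__nnUNetPlansv2.1/fold_0/model_final_checkpoint.model'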

Many thanks! Noe

noehsueh commented on Apr 17, 2024

Hi @noehsueh, have you already tried following the documentation here: https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/pretraining_and_finetuning.md
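(For readers landing here: the workflow in that document boils down to roughly the following, with SOURCE/TARGET dataset IDs and the plans identifiers as placeholders; see the linked page for the authoritative commands:)

nnUNetv2_extract_fingerprint -d TARGET
nnUNetv2_move_plans_between_datasets -s SOURCE -t TARGET -sp SOURCE_PLANS_IDENTIFIER -tp TARGET_PLANS_IDENTIFIER
nnUNetv2_preprocess -d TARGET -plans_name TARGET_PLANS_IDENTIFIER
nnUNetv2_train TARGET CONFIGURATION FOLD -p TARGET_PLANS_IDENTIFIER -pretrained_weights PATH_TO_CHECKPOINT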

Let me know if that clarifies the use of overwrite_plans!

GregorKoehler commented on Apr 22, 2024

Hi @GregorKoehler,

Thanks for your reply. I'm using nnUNet v1, and the documentation seems to cover only v2. If I understood it correctly, the plans file contains the network topology, data preprocessing configuration, etc., and the documented workflow also requires training on the source dataset. Since my source dataset is BraTS 2020 and I already have the weights, I was wondering if there's a way to fine-tune the model without training nnUNet on Task082_BraTS2020 again. I ran nnUNet_plan_and_preprocess -t 082 --verify_dataset_integrity to check that my data matches the BraTS2020 setup, but I didn't transfer any plans files between the source and the target. Was this the right approach?

Thanks! All the best, Noe

noehsueh commented on Apr 22, 2024

Hey Noe, it's very hard for us to debug old nnU-Net v1 issues now that v2 has been out for a long time. Please try v2; it will make it a lot easier for us to help you. We have also improved the documentation a lot. Best, Fabian

FabianIsensee commented on Jun 4, 2024

Hi Fabian,

I've been using nnUNet v1 because the pretrained weights I found for BraTS 2020 were only available for that version. Will pretrained model weights be made available for v2? Unfortunately, I have limited computational resources and might not be able to train the model from scratch.

Best, Noe

noehsueh commented on Jun 4, 2024

Hi @noehsueh, I'm looking into making pretrained weights available for BraTS 2020 in v2 and will post an update here soon. I'm confident we can provide pretrained weights for this dataset.

Best, Gregor

GregorKoehler commented on Jun 16, 2024

Hi Gregor,

That would be very helpful, thanks!

Best, Noe

noehsueh commented on Jun 16, 2024

Hi @noehsueh,

please excuse the late update on this issue. I was sadly unable to find pretrained nnUNet v2 weights for this dataset. I did, however, notice that I have the dataset stored in nnUNet v2 format, so I could run the training myself and share the weights with you. What setup are you after? I would propose running the currently recommended residual encoder preset (https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/resenc_presets.md) on fold "all" (i.e. one run on the whole dataset). Would this be helpful for you?
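(If it helps, my understanding of the ResEnc preset commands from that page, using the L preset as an example and DATASET_ID as a placeholder:)

nnUNetv2_plan_and_preprocess -d DATASET_ID -pl nnUNetPlannerResEncL
nnUNetv2_train DATASET_ID 3d_fullres all -p nnUNetResEncUNetLPlans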

GregorKoehler commented on Sep 6, 2024