
Multi-GPU nnUNet training

Open jhdezr1 opened this issue 1 year ago • 2 comments

Hi there!

I am trying to train a 3d_fullres model, but the automatically chosen patch size is too small, even though it already maxes out the memory of a single GPU. I would therefore like to try multi-GPU training, but I cannot find an argument in nnUNetv2_plan_and_preprocess to configure it that way.

I would appreciate some help!!

Thank you!

jhdezr1 avatar Sep 04 '24 08:09 jhdezr1

https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/how_to_use_nnunet.md#using-multiple-gpus-for-training

You would need DDP for that, I think: `nnUNetv2_train DATASET_NAME_OR_ID 2d 0 [--npz] -num_gpus X`
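For your case that would be the 3d_fullres configuration rather than 2d. A minimal sketch (dataset ID, fold, and GPU IDs are placeholders; adjust to your setup):

```shell
# Restrict nnU-Net to two specific GPUs and launch DDP training on both.
# --npz is optional; it additionally saves softmax outputs during validation.
CUDA_VISIBLE_DEVICES=0,1 nnUNetv2_train DATASET_NAME_OR_ID 3d_fullres 0 -num_gpus 2
```

Note that, as the linked documentation explains, multiple GPUs do not automatically increase the patch size; the plans still have to be changed for that.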

aymuos15 avatar Sep 05 '24 15:09 aymuos15

Hi @jhdezr1, the simplest way to do this is to add a custom plans file and use these plans for training.

You can do so by writing a custom planner for this, which is detailed in the general information about plans files (see below).

Adding a custom configuration to the plans.json file inside the nnUNet_preprocessed folder and training a model with it is also demonstrated here: https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/competitions/AutoPETII.md

For a general explanation of the plans files, see here: https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/explanation_plans_files.md
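As a minimal sketch of the plans.json route (the configuration name `3d_fullres_bigpatch`, the file path in the comments, and the patch size shown are all placeholders; pick values that fit your data and VRAM):

```python
import json


def add_big_patch_config(plans: dict,
                         parent: str = "3d_fullres",
                         name: str = "3d_fullres_bigpatch",
                         patch_size=(160, 192, 192)) -> dict:
    """Add a configuration derived from `parent` with an overridden patch size.

    nnU-Net v2 resolves "inherits_from" when loading the plans, so only the
    keys we want to change need to be written out here.
    """
    plans["configurations"][name] = {
        "inherits_from": parent,
        "patch_size": list(patch_size),  # example value; tune to your GPU budget
    }
    return plans


# Usage (path is a placeholder for your dataset's preprocessed folder):
# with open("nnUNet_preprocessed/DatasetXXX_Name/nnUNetPlans.json") as f:
#     plans = json.load(f)
# add_big_patch_config(plans)
# with open("nnUNet_preprocessed/DatasetXXX_Name/nnUNetPlans.json", "w") as f:
#     json.dump(plans, f, indent=4)
```

You would then train the new configuration with `nnUNetv2_train DATASET_NAME_OR_ID 3d_fullres_bigpatch FOLD -num_gpus X`. Larger patches also mean each GPU must hold the bigger activation maps, so combine this with the multi-GPU suggestion above only after checking that a single batch of the new patch size still fits in memory.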

Best regards, Carsten

sten2lu avatar Sep 09 '24 09:09 sten2lu