Question about dataloader in DDP
Hi Fabian,
In a DDP training setting, PyTorch recommends using a DistributedSampler together with the DDP model to split the dataset across GPUs. I have read through your DDP trainer and the get_moreDA_augmentation script thoroughly and cannot find where you use this feature. So my question is: why didn't you use a DistributedSampler with the DDPTrainer, and how does your dataloader work in DDP mode?
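For reference, the pattern I mean is roughly the following (a minimal sketch in plain PyTorch, launched with torchrun; the toy dataset and model are placeholders, not anything from nnU-Net):

```python
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset

# toy dataset/model just to make the pattern concrete
dataset = TensorDataset(torch.randn(100, 8), torch.randn(100, 1))

dist.init_process_group(backend="nccl")
local_rank = int(os.environ["LOCAL_RANK"])  # set by torchrun
torch.cuda.set_device(local_rank)

# DistributedSampler partitions the index set so each rank iterates
# over a disjoint shard of the dataset
sampler = DistributedSampler(dataset)
loader = DataLoader(dataset, batch_size=4, sampler=sampler)
model = DDP(nn.Linear(8, 1).cuda(local_rank), device_ids=[local_rank])
opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for epoch in range(2):
    sampler.set_epoch(epoch)  # reshuffles the shard assignment each epoch
    for x, y in loader:
        x, y = x.cuda(local_rank), y.cuda(local_rank)
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
```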
I hope to see your reply soon. Many thanks! BR,
nnU-Net always samples random cases during training. It does not iterate over a dataset object the way PyTorch usually does. If every GPU just picks random samples, you don't need a DistributedSampler to coordinate the samples between the GPUs.
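To illustrate the idea, here is a minimal sketch of a dataloader that draws random cases on each rank independently. This is not nnU-Net's actual DataLoader3D; `case_ids` and `load_case` are placeholders, and the only nnU-Net-relevant point is that the sampling is random per rank, so no cross-GPU coordination is required:

```python
import numpy as np

# placeholder list of case identifiers; in nnU-Net these would come from
# the preprocessed dataset folder
case_ids = [f"case_{i:03d}" for i in range(50)]

def load_case(case_id):
    # stand-in for reading a preprocessed image/segmentation pair from disk
    return np.random.randn(1, 32, 32, 32), np.zeros((1, 32, 32, 32))

class RandomCaseLoader:
    """Each DDP rank builds its own instance with a different seed, so every
    GPU draws independent random batches. No DistributedSampler is needed,
    because training never iterates the dataset in a fixed order."""

    def __init__(self, case_ids, batch_size, seed):
        self.case_ids = case_ids
        self.batch_size = batch_size
        self.rng = np.random.RandomState(seed)

    def next_batch(self):
        # sample case ids uniformly at random, with replacement
        chosen = self.rng.choice(self.case_ids, self.batch_size, replace=True)
        data, seg = zip(*(load_case(c) for c in chosen))
        return np.stack(data), np.stack(seg)

# e.g. offset the seed by the rank so GPUs don't draw identical batches
loader = RandomCaseLoader(case_ids, batch_size=2, seed=1234 + 0)  # rank 0
batch_data, batch_seg = loader.next_batch()
```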