BasicSR
distributed training
In your `build_dataloader`:

```python
if phase == 'train':
    if dist:  # distributed training
        batch_size = dataset_opt['batch_size_per_gpu']
        num_workers = dataset_opt['num_worker_per_gpu']
    else:  # non-distributed training
        multiplier = 1 if num_gpu == 0 else num_gpu
        batch_size = dataset_opt['batch_size_per_gpu'] * multiplier
        num_workers = dataset_opt['num_worker_per_gpu'] * multiplier
```
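For context, the asymmetry matches how PyTorch runs the two modes: under DDP there is one process per GPU, each building its own dataloader, so `batch_size_per_gpu` is already per-process; under non-distributed `DataParallel`, a single process feeds every GPU from one dataloader, so the loader's batch must be scaled up front. A minimal sketch of the resulting totals (the concrete numbers here are illustrative assumptions, not from the source):

```python
# Assumed example values, using the same option keys as the snippet above.
dataset_opt = {'batch_size_per_gpu': 4, 'num_worker_per_gpu': 2}
num_gpu = 2

# Distributed (DDP): one process per GPU, each with its own dataloader,
# so the per-process batch size is already the per-GPU batch size.
ddp_batch_per_process = dataset_opt['batch_size_per_gpu']
ddp_total_batch = ddp_batch_per_process * num_gpu

# Non-distributed (DataParallel): one process drives all GPUs from a single
# dataloader, so the loader batch is multiplied to cover every GPU.
multiplier = 1 if num_gpu == 0 else num_gpu
dp_total_batch = dataset_opt['batch_size_per_gpu'] * multiplier

# Both modes end up with the same effective total batch across GPUs.
print(ddp_total_batch, dp_total_batch)
```

Under these assumed values, both modes reach the same effective total batch of 8, just accounted for in different places.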
I think the `if` and `else` branches should be swapped? The multiplier should be applied in the distributed-training branch.
That's what I thought too, but after I swapped the branches, training actually got worse. Do you have any thoughts on this now? Can you clear up my confusion?