torch-ngp
Can it run with torch.distributed? For example, I want to train on 8 x V100s. How can I do this, or what code would I need to add?
@wacyfdyy Sorry, this is not implemented. You may want to check ngp_pl, which supports parallel training.
Thank you for your answer. I am now trying to add DDP code myself.
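In case it helps, here is a minimal, generic DistributedDataParallel sketch (not part of torch-ngp). The `nn.Sequential` model and `TensorDataset` are toy stand-ins for the NeRF network and ray/image dataset that torch-ngp actually builds, so you would need to swap those in; the DDP setup itself (process group init, device pinning, `DistributedSampler`, gradient all-reduce on `backward()`) is the part that carries over.

```python
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset

def main():
    # torchrun sets LOCAL_RANK / RANK / WORLD_SIZE in the environment.
    local_rank = int(os.environ["LOCAL_RANK"])
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(local_rank)

    # Toy stand-in for the NeRF network; replace with the real model.
    model = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 4)).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    # Toy stand-in for the ray/image dataset; DistributedSampler shards it across ranks.
    dataset = TensorDataset(torch.randn(1024, 3), torch.randn(1024, 4))
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=64, sampler=sampler)

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.MSELoss()

    for epoch in range(5):
        sampler.set_epoch(epoch)              # reshuffle shards each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            loss = criterion(model(x), y)
            optimizer.zero_grad()
            loss.backward()                   # DDP all-reduces gradients here
            optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with something like `torchrun --nproc_per_node=8 train_ddp.py`, each of the 8 V100s gets its own process and data shard.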
Can the non-CUDA-ray mode support multi-GPU training, @ashawkey?