
Can it run with torch.distributed? For example, I want to train with torch.distributed on 8 x V100s. How can I do this, or what code would I need to add?


wacyfdyy · Oct 10 '22 02:10

@wacyfdyy Sorry, this is not implemented. You may want to check ngp_pl, which supports parallel training.

ashawkey · Oct 10 '22 05:10

@ashawkey Thank you for your answer. Now I am trying to add DDP code.

wacyfdyy · Oct 10 '22 09:10

Can the non-CUDA-ray mode support multi-GPU training, @ashawkey?

Kartik-Teotia · Oct 10 '22 13:10