accelerate
Slurm support
Hey, I am wondering if accelerate supports SLURM and, if so, how one runs accelerate on SLURM in a multi-GPU setting? Thanks, Eliahu
We don't have support for SLURM right now :-)
Thanks for the quick answer!
I am using a repo that was written with accelerate, but on a SLURM cluster. Assuming I succeed in launching the code with torch.distributed.launch, will the current code written with the accelerate API work as-is, or will I need to refactor it to use PyTorch DDP directly?
It should all work as long as you can launch it!
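For reference, a minimal sketch of what such a launch could look like as an sbatch script. All names here (job name, partition-style options, `train.py`, 4 GPUs on one node) are assumptions to adapt to your cluster; accelerate's `Accelerator` picks up the distributed environment variables that `torch.distributed.launch --use_env` sets for each process:

```bash
#!/bin/bash
#SBATCH --job-name=accelerate-ddp   # hypothetical job name
#SBATCH --nodes=1                   # single-node example
#SBATCH --ntasks-per-node=1         # the launcher spawns the per-GPU processes
#SBATCH --gres=gpu:4                # assumption: 4 GPUs on the node

# Launch one process per GPU; --use_env exports RANK/LOCAL_RANK/WORLD_SIZE
# as environment variables, which accelerate reads at initialization.
python -m torch.distributed.launch \
    --nproc_per_node=4 \
    --use_env \
    train.py  # hypothetical entry point of the accelerate-based script
```

Submitted with `sbatch launch.sh`; multi-node runs would additionally need `--nnodes`, `--node_rank`, and the master address/port wired up from SLURM's environment.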
Perfect, thanks!
Could you please reopen this issue? I think I can work on this; maybe we can have some configuration with the help of submitit.
Reopened, thanks!
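To illustrate the submitit idea, a rough sketch of how a training function could be submitted to SLURM from Python. The function body, folder, partition name, and resource numbers are all placeholders, not an accelerate API:

```python
import submitit

def train():
    # Hypothetical placeholder for an accelerate-based training function;
    # it runs inside the SLURM job allocation.
    ...

# AutoExecutor writes job scripts/logs to the given folder and picks the
# SLURM backend automatically when available.
executor = submitit.AutoExecutor(folder="slurm_logs")
executor.update_parameters(
    nodes=1,
    tasks_per_node=4,          # one task per GPU
    gpus_per_node=4,           # assumption: 4-GPU node
    slurm_partition="gpu",     # assumption: partition name varies per cluster
    timeout_min=60,
)
job = executor.submit(train)   # returns a Job handle; job.result() blocks until done
```

A wrapper like this could be what an `accelerate launch`-style SLURM integration builds on.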
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.