torchdistx
Support PyTorch 2.1.0
What does this PR do:
Fixes compatibility issues with PyTorch 2.1.0 and later; corresponding issue: #79
Does your PR introduce any breaking changes? If yes, please list them:
ProxyVariableHooks::basic_autograd_not_implemented_fallback
Hi @Seventeen17!
Thank you for your pull request and welcome to our community.
Action Required
In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.
Process
In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.
Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.
If you have received this in error or have any questions, please contact us at [email protected]. Thanks!
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!
@H-Huang Could you please help me check what's wrong with the runners?
Hi @Seventeen17, thank you for adding this PR. I managed to install torchdistx with your PR; however, I failed to initialize the 65B model with FSDP. Did you manage to initialize a large model? Thanks in advance.
@Seventeen17 I managed to train the large model. Thanks again for your contribution.
I applied this fix and was able to initialize and train a 65B model with PyTorch/XLA FSDP, but it also needs some fixes on the PyTorch/XLA FSDP side.
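For readers landing here with the same large-model question: a minimal sketch of the usual way torchdistx deferred initialization is combined with FSDP, not taken from this PR. It assumes torchdistx is installed, the script is launched with torchrun on CUDA devices, and `MyLargeModel` is a placeholder for your own module class.

```python
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torchdistx.deferred_init import deferred_init, materialize_module


class MyLargeModel(nn.Module):
    """Placeholder for an actual multi-billion parameter model."""

    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(*[nn.Linear(4096, 4096) for _ in range(8)])

    def forward(self, x):
        return self.layers(x)


def main():
    # Assumes launch via torchrun, so RANK/WORLD_SIZE/MASTER_ADDR are already set.
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

    # Build the module on fake tensors: no real parameter storage is allocated yet.
    model = deferred_init(MyLargeModel)

    # FSDP calls param_init_fn while wrapping, so parameters are materialized
    # only once sharding is decided and each rank holds just its own shard.
    fsdp_model = FSDP(
        model,
        param_init_fn=lambda module: materialize_module(module),
        device_id=torch.cuda.current_device(),
    )
    print(fsdp_model)


if __name__ == "__main__":
    main()
```

This mirrors the param_init_fn pattern described in the FSDP documentation; the PyTorch/XLA FSDP wrapper mentioned above needs its own adjustments and is not covered by this sketch.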