
fixed mixed precision in FSDP

Open denadai2 opened this issue 4 months ago • 3 comments

Context

  • FSDP initializes a MixedPrecision policy but does not actually use it
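
A minimal sketch of the kind of bug described above, assuming the problem is that a `MixedPrecision` policy gets constructed but never passed to the FSDP wrapper (the specific torchtune utility is not shown in this thread, so the buggy/fixed patterns below are illustrative, not the actual code):

```python
import torch
from torch.distributed.fsdp import MixedPrecision

# Build a mixed-precision policy: params, gradient reductions, and
# buffers all in bf16.
mp_policy = MixedPrecision(
    param_dtype=torch.bfloat16,
    reduce_dtype=torch.bfloat16,
    buffer_dtype=torch.bfloat16,
)

# Buggy pattern (policy created, then silently ignored; FSDP runs in
# full precision). Wrapping requires an initialized process group, so
# these lines are shown as comments:
#   model = FSDP(model)
# Fixed pattern (policy actually applied):
#   model = FSDP(model, mixed_precision=mp_policy)

print(mp_policy.param_dtype)
```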

denadai2 avatar Apr 18 '24 17:04 denadai2

:link: Helpful Links

:test_tube: See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchtune/796

Note: Links to docs will display an error until the docs builds have been completed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

pytorch-bot[bot] avatar Apr 18 '24 17:04 pytorch-bot[bot]

Hi @denadai2!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!

facebook-github-bot avatar Apr 18 '24 17:04 facebook-github-bot

Hi @denadai2, thanks for the PR! Actually, this utility is not used in any of our recipes (a code search shows it appears only in the unit tests and the init file), so I think this is dead code anyway. In fact, we don't really support mixed precision training; we prefer true bf16 training instead, which requires less memory. Let me know if this makes sense.

ebsmothers avatar Apr 18 '24 18:04 ebsmothers
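
The "true bf16" approach mentioned in the comment above can be sketched as follows: instead of keeping fp32 master weights and casting on the fly (mixed precision), the whole model is cast to bfloat16, so parameters, activations, and optimizer state all live in bf16. This is a minimal toy illustration, not torchtune's actual recipe code:

```python
import torch
from torch import nn

# A toy model cast wholesale to bf16; no separate full-precision
# master copy of the weights is kept, which is what saves memory.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
model = model.to(dtype=torch.bfloat16)

# Inputs must match the parameter dtype.
x = torch.randn(8, 16, dtype=torch.bfloat16)
out = model(x)

print(out.dtype)                                              # torch.bfloat16
print(all(p.dtype == torch.bfloat16 for p in model.parameters()))  # True
```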