
Add KD distributed recipe

lindawangg opened this pull request • 2 comments

Context

What is the purpose of this PR? Is it to

  • [x] add a new feature
  • [ ] fix a bug
  • [ ] update tests and/or documentation
  • [ ] other (please add here)

To enable distributed training for knowledge distillation.
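To make the goal concrete, here is a minimal sketch (not the recipe code itself) of the kind of loss a knowledge-distillation step computes: the student's standard cross-entropy on the labels blended with a forward-KL term against a frozen teacher's logits. The `kd_step` name and `kd_ratio` weighting are illustrative assumptions, not torchtune APIs.

```python
# Illustrative KD loss sketch: CE on labels + forward KL against the teacher.
# `kd_ratio` and `kd_step` are placeholder names for this sketch only.
import torch
import torch.nn.functional as F

def kd_step(student_logits, teacher_logits, labels, kd_ratio=0.5, ignore_index=-100):
    # Standard next-token cross-entropy on the student's predictions
    ce_loss = F.cross_entropy(
        student_logits.view(-1, student_logits.size(-1)),
        labels.view(-1),
        ignore_index=ignore_index,
    )
    # Forward KL: push the student's distribution toward the teacher's
    teacher_probs = F.softmax(teacher_logits, dim=-1)
    student_log_probs = F.log_softmax(student_logits, dim=-1)
    kd_loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
    return (1 - kd_ratio) * ce_loss + kd_ratio * kd_loss
```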

Changelog

What are the changes made in this PR?

  • Builds on top of https://github.com/pytorch/torchtune/pull/1539
  • KD distributed recipe (knowledge_distillation_distributed.py) is similar to lora_finetune_distributed.py; a rough sketch of the setup follows this list.
  • KD config: knowledge_distillation_distributed.yaml
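As referenced above, the rough shape of a distributed KD setup, mirroring lora_finetune_distributed.py, is that the student is sharded and trained while the teacher stays frozen and is only run forward. The snippet below is an assumption about that structure (plain FSDP is used for brevity), not the actual recipe implementation.

```python
# Sketch of the distributed setup: sharded trainable student, frozen teacher.
# Assumes torch.distributed has already been initialized (e.g. by tune run).
import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

def setup_models(student, teacher, device):
    student = FSDP(student.to(device))   # sharded, trainable
    teacher = teacher.to(device).eval()  # frozen reference model
    for p in teacher.parameters():
        p.requires_grad = False
    return student, teacher

@torch.no_grad()
def teacher_forward(teacher, tokens):
    # Teacher logits are only consumed by the KD loss; no grads are stored.
    return teacher(tokens)
```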

Test plan

Please make sure to do each of the following if applicable to your PR. If you're unsure about any one of these just ask and we will happily help. We also have a contributing page for some guidance on contributing.

  • [x] run pre-commit hooks and linters (make sure you've first installed via pre-commit install)
  • [x] add unit tests for any new functionality
  • [x] update docstrings for any new or updated methods or classes
  • [x] run unit tests via pytest tests
  • [x] run recipe tests via pytest tests -m integration_test
  • [x] manually run any new or modified recipes with sufficient proof of correctness
  • [x] include relevant commands and any other artifacts in this summary (pastes of loss curves, eval results, etc.)
tune run --nodes 1 --nproc_per_node 2 knowledge_distillation_distributed --config qwen2/knowledge_distillation_distributed

[Loss curves: (left) single device, (right) distributed; the distributed run can also use a larger batch size. Eval results are similar across the two runs.]
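The correctness argument above is that the distributed run tracks the single-device run. A hypothetical helper for that kind of parity check might look like the following; the loss-parsing plumbing is omitted and the names are placeholders, not torchtune test utilities.

```python
# Hypothetical parity check between distributed and single-device KD runs.
# Loss values would come from parsing the recipes' logged metrics; this
# function only expresses the comparison itself.
import math

def assert_loss_parity(distributed_losses, single_device_losses, abs_tol=5e-2):
    assert len(distributed_losses) == len(single_device_losses)
    for step, (got, expected) in enumerate(zip(distributed_losses, single_device_losses)):
        assert math.isclose(got, expected, abs_tol=abs_tol), (
            f"step {step}: distributed loss {got} vs single-device loss {expected}"
        )
```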

UX

If your function changed a public API, please add a dummy example of what the user experience will look like when calling it. Here is a docstring example and a tutorial example

  • [ ] I did not change any public API
  • [x] I have added an example to docs or docstrings

lindawangg · Sep 20, 2024

:link: Helpful Links

:test_tube: See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchtune/1631

Note: Links to docs will display an error until the docs builds have been completed.

:white_check_mark: No Failures

As of commit cf5f01a062856a6fe135102e68698cae907da31a with merge base d3039da900dbc5712ecb1f660ccf471cdcf8d107: :green_heart: Looks good so far! There are no failures yet. :green_heart:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

pytorch-bot[bot] · Sep 20, 2024

Hey @lindawangg, thanks for the recipe!! We have been a bit busy, but we will get to this PR.

felipemello1 · Sep 24, 2024

Codecov Report

Attention: Patch coverage is 5.65111% with 384 lines in your changes missing coverage. Please review.

Project coverage is 68.70%. Comparing base (23c8829) to head (cf5f01a). Report is 386 commits behind head on main.

Files with missing lines                                  Patch %   Lines
recipes/knowledge_distillation_distributed.py               0.00%   313 Missing :warning:
...recipes/test_knowledge_distillation_distributed.py      24.46%    71 Missing :warning:
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1631      +/-   ##
==========================================
- Coverage   70.44%   68.70%   -1.74%     
==========================================
  Files         308      306       -2     
  Lines       16270    16596     +326     
==========================================
- Hits        11462    11403      -59     
- Misses       4808     5193     +385     

:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.

codecov-commenter · Oct 26, 2024