transformers
Update trainer_utils.py
This modified version of the function checks whether the function's output carries a learning rate scheduler that needs to be updated based on the current batch size. If so, it updates the scheduler's `num_batches` attribute so that the learning rate is adjusted correctly.
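A minimal sketch of that check (illustrative only: `output`, `num_examples`, and a scheduler exposing a `num_batches` attribute are assumptions for this example, not the actual `trainer_utils.py` internals):

```python
import math

def maybe_resync_scheduler(output, num_examples, batch_size):
    """If the training output carries an LR scheduler, re-sync its step
    horizon with the (possibly decayed) batch size."""
    lr_scheduler = getattr(output, "lr_scheduler", None)
    if lr_scheduler is not None and hasattr(lr_scheduler, "num_batches"):
        # auto_find_batch_size may have halved the batch size, so the number
        # of batches per epoch grows and the schedule must be recomputed,
        # otherwise the LR keeps decaying on the stale, shorter horizon.
        lr_scheduler.num_batches = math.ceil(num_examples / batch_size)
    return output
```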
What does this PR do?
It is one possible fix for the problem reported in #21521: the lr_scheduler is not updated when auto_find_batch_size is set to True and the batch size decays.
Fixes #21521
Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the contributor guideline, Pull Request section?
- [x] Was this discussed/approved via a Github issue or the forum? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
- [ ] Did you write any new necessary tests?
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.
cc @muellerzr
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed, please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
@muellerzr Could you have a look here?
@mzamini92 could you rebase so we can double check no tests are breaking with this and we can merge? Thanks!
@muellerzr Thanks for reaching out. I did it based on Sylvain's suggestion. Please double-check, and I will revise if needed.