transformers
Add a check that warmup_steps is either 0 or >= 1
What does this PR do?
Update training_args.py to add a check that warmup_steps is either 0 or an integer >= 1; otherwise, raise an error.
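A minimal sketch of the kind of check this PR describes; the class, field defaults, and error message here are illustrative assumptions, not the actual `training_args.py` implementation:

```python
from dataclasses import dataclass


@dataclass
class TrainingArguments:
    # Hypothetical trimmed-down stand-in for the real TrainingArguments
    warmup_steps: int = 0

    def __post_init__(self):
        # warmup_steps must be 0 or a whole number >= 1; fractional or
        # negative values should raise instead of being silently accepted.
        if not isinstance(self.warmup_steps, int) or self.warmup_steps < 0:
            raise ValueError(
                f"warmup_steps must be either 0 or an integer >= 1, "
                f"got {self.warmup_steps}"
            )


args = TrainingArguments(warmup_steps=500)  # valid

try:
    TrainingArguments(warmup_steps=0.5)  # fractional -> rejected
except ValueError as err:
    print(err)
```

Fractional warmup is typically expressed through a separate ratio argument, so rejecting non-integer values here surfaces a misconfiguration early rather than at training time.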
Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the contributor guideline, Pull Request section?
- [ ] Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
- [ ] Did you write any new necessary tests?
trainer: @muellerzr and @pacman100