transformers
Remove redundant backend checks in training_args.py
What does this PR do?
Fixes #28109
Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the contributor guideline, Pull Request section?
- [x] Was this discussed/approved via a Github issue or the forum? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
- [ ] Did you write any new necessary tests?
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
@ArthurZucker @muellerzr
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Thanks @kevint324! Overall this looks fine but let's fix the failing quality checks!
`pip install -e .[quality]; make fixup;`
@muellerzr
Done. After running `make fixup`, an extra import was removed.
Thanks
I looked into the failing case and found the cause.
In my PR I removed the calls to `self.device` inside the if statements; accessing `self.device` invokes `_setup_devices`, which performs some initialization as a side effect.
I still have no idea why `test_offline.py::test_offline_mode` fails, though.
I tried it on main and it still failed. @ArthurZucker
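The side effect described above can be sketched as follows. This is a minimal, hypothetical simplification (the constructor argument and device strings are assumptions, not the actual `training_args.py` code): reading the `device` property lazily triggers `_setup_devices`, so removing a seemingly redundant check that happened to read `self.device` can also remove the initialization it caused.

```python
from functools import cached_property


class TrainingArguments:
    """Minimal sketch of the lazy device-setup pattern (hypothetical names)."""

    def __init__(self, use_cpu: bool = False):
        self.use_cpu = use_cpu

    @cached_property
    def _setup_devices(self) -> str:
        # In the real class this would initialize the distributed backend;
        # here we just return a device string to model the side effect.
        return "cpu" if self.use_cpu else "cuda"

    @property
    def device(self) -> str:
        # Accessing .device runs _setup_devices on first use -- which is why
        # a branch that merely read self.device was not purely redundant.
        return self._setup_devices
```

Because `_setup_devices` is a `cached_property`, the initialization runs only once, on the first access to `device`.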
Failing tests are independent of the changes, merging