[Fix Bugs] Fix keys in `_load_pretrained_model`
What does this PR do?
Fixes a bug in `_load_pretrained_model`: `f'{prefix}.key'` is wrong because the variable `key` is never interpolated in this branch, so the literal string `"key"` is appended to the prefix instead of the actual key name.
This bug causes loading to fail for some models, such as BLOOM-176B.
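A minimal sketch of the issue, with illustrative names (`prefix`, `key`, and the sample checkpoint keys are assumptions, not the exact code in `modeling_utils.py`):

```python
prefix = "transformer"  # hypothetical base-model prefix, e.g. for BLOOM
checkpoint_keys = ["word_embeddings.weight", "h.0.self_attention.query_key_value.weight"]

# Buggy: the literal string "key" is appended, so every entry becomes "transformer.key".
buggy_keys = [f"{prefix}.key" for key in checkpoint_keys]

# Fixed: interpolate the loop variable so each checkpoint key is actually prefixed.
fixed_keys = [f"{prefix}.{key}" for key in checkpoint_keys]

print(buggy_keys)  # ['transformer.key', 'transformer.key']
print(fixed_keys)  # ['transformer.word_embeddings.weight', 'transformer.h.0.self_attention.query_key_value.weight']
```

With the buggy version, none of the prefixed keys match the model's state dict, so the weights for those entries are never matched during loading.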
Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the contributor guideline, Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the forum? Please add a link to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
- [ ] Did you write any new necessary tests?
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
cc @sgugger