[BugFix]: KeyError: 'Adafactor is already registered in optimizer at torch.optim'
Motivation
This PR fixes the `KeyError: 'Adafactor is already registered in optimizer at torch.optim'` raised during optimizer registration when the `Adafactor` optimizer from the `transformers` library collides with the one that newer PyTorch versions ship in `torch.optim`.
Modification
When optimizers are registered, the `Adafactor` entry from the `transformers` library is force-updated (overwriting the entry from `torch.optim`), because the `transformers` implementation was the one in use before PyTorch introduced an optimizer with the same name. A sketch of the approach is shown below.
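A minimal sketch of the idea, not the exact patch: the registration helper for `transformers` optimizers passes `force=True` so the existing `torch.optim` entry is overwritten instead of raising a `KeyError`. The helper name mirrors MMEngine's `register_transformers_optimizers`; treat the exact function and surrounding code as an assumption about the patched file.

```python
# Sketch only: force-register the transformers Adafactor over the
# torch.optim entry that recent PyTorch versions already add.
from mmengine.registry import OPTIMIZERS


def register_transformers_optimizers():
    transformer_optimizers = []
    try:
        from transformers import Adafactor
    except ImportError:
        pass
    else:
        # force=True replaces any 'Adafactor' already registered from
        # torch.optim, so the transformers implementation keeps being used
        # and "Adafactor is already registered in optimizer" is not raised.
        OPTIMIZERS.register_module(name='Adafactor', module=Adafactor, force=True)
        transformer_optimizers.append('Adafactor')
    return transformer_optimizers
```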
BC-breaking (Optional)
Does the modification introduce changes that break the backward-compatibility of the downstream repos? No. The `Adafactor` from `transformers` is forced to remain the registered implementation, which matches the previous behaviour, so the change is backward-compatible.
Use cases (Optional)
NA
Checklist
- Pre-commit or other linting tools are used to fix the potential lint issues.
- The modification is covered by complete unit tests. If not, please add more unit tests to ensure the correctness.
- If the modification has potential influence on downstream projects, this PR should be tested with downstream projects, like MMDetection or MMPretrain.
- The documentation has been modified accordingly, like docstring or example tutorials.