Fix LR scheduler behaviour with AMP
## What does this PR do?
When training with native AMP and an LR scheduler, we get a warning indicating that an LR step was taken while the optimizer step was skipped (expected at the beginning of training with native AMP, while the `GradScaler` calibrates its scale):
```
/usr/local/lib/python3.8/dist-packages/torch/optim/lr_scheduler.py:138: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
```
Fixes #16228, fixes #5558
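For context, the usual workaround in a raw `torch.cuda.amp` training loop is to compare the `GradScaler`'s scale before and after `update()`: when the scaler finds inf/NaN gradients it skips `optimizer.step()` and backs off its scale, so a decreased scale means the scheduler step should be skipped too. Below is a minimal self-contained sketch of that pattern; it uses a stub scaler (`FakeGradScaler` is hypothetical, not a real API) so it runs without a GPU, and it is not the actual Lightning implementation:

```python
class FakeGradScaler:
    """Hypothetical stand-in for torch.cuda.amp.GradScaler (illustration only)."""

    def __init__(self, init_scale=65536.0):
        self._scale = init_scale
        self._found_inf = False

    def simulate_inf_grads(self):
        # Pretend the backward pass produced inf/NaN gradients this step.
        self._found_inf = True

    def step(self, optimizer_step):
        # GradScaler skips the optimizer step when inf/NaN gradients were found.
        if not self._found_inf:
            optimizer_step()

    def update(self):
        # After a skipped step, GradScaler backs off (halves) its scale.
        if self._found_inf:
            self._scale *= 0.5
        self._found_inf = False

    def get_scale(self):
        return self._scale


def run_steps(skip_pattern):
    """Count optimizer/scheduler steps; only step the scheduler when the optimizer ran."""
    scaler = FakeGradScaler()
    optimizer_steps = scheduler_steps = 0

    def optimizer_step():
        nonlocal optimizer_steps
        optimizer_steps += 1

    for skipped in skip_pattern:
        if skipped:
            scaler.simulate_inf_grads()
        scale_before = scaler.get_scale()
        scaler.step(optimizer_step)
        scaler.update()
        # Scale unchanged (or grown) means the optimizer step actually ran.
        if scaler.get_scale() >= scale_before:
            scheduler_steps += 1
    return optimizer_steps, scheduler_steps


# First two steps skipped (typical while AMP calibrates), then three real steps.
print(run_steps([True, True, False, False, False]))  # -> (3, 3)
```

With the real `GradScaler`, the same before/after `get_scale()` comparison works because `update()` only lowers the scale when the step was skipped (periodic scale growth after successful steps is covered by the `>=`).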
## Does your PR introduce any breaking changes? If yes, please list them.
No
## Before submitting
- [x] Was this discussed/approved via a GitHub issue? (not for typos and docs)
- [x] Did you read the contributor guideline, Pull Request section?
- [x] Did you make sure your PR does only one thing, instead of bundling different changes together?
- [ ] Did you make sure to update the documentation with your changes? (if necessary)
- [ ] Did you write any new necessary tests? (not for typos and docs)
- [ ] Did you verify new and existing tests pass locally with your changes?
- [ ] Did you list all the breaking changes introduced by this pull request?
- [ ] Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)
## PR review
Anyone in the community is welcome to review the PR. Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:
- [x] Is this pull request ready for review? (if not, please submit in draft mode)
- [x] Check that all items from Before submitting are resolved
- [x] Make sure the title is self-explanatory and the description concisely explains the PR
- [x] Add labels and milestones (and optionally projects) to the PR so it can be classified
In the process of fixing tests I discovered and fixed a bug where the scheduler wouldn't match its optimizer when multiple optimizers are instantiated with frequencies. Now the optimizers and schedulers match and alternate as they should, resetting the cycle every epoch.
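To illustrate the frequency-based matching (with a hypothetical helper, not Lightning's actual internals): with frequencies like `[2, 1]`, batches within an epoch should activate optimizer 0, 0, 1, 0, 0, 1, …, and the matching scheduler must step with its own optimizer. Because the index is derived from the within-epoch batch index, the cycle resets at every epoch boundary:

```python
from itertools import accumulate


def active_index(batch_idx, frequencies):
    """Return which optimizer/scheduler pair is active for this batch.

    `batch_idx` is the batch index *within the epoch*, so the cycle
    restarts every epoch. (Hypothetical helper for illustration.)
    """
    position = batch_idx % sum(frequencies)
    for i, bound in enumerate(accumulate(frequencies)):
        if position < bound:
            return i


# Two optimizers with frequencies 2 and 1: pattern 0, 0, 1 repeating.
print([active_index(b, [2, 1]) for b in range(6)])  # -> [0, 0, 1, 0, 0, 1]
```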
@carmocca Ready for final review
The way I fixed the `tests/tests_pytorch/models/test_hooks.py::test_trainer_model_hook_system_fit[True-kwargs1]` test is very flaky, so I'd appreciate it if someone more familiar with these tests could come up with a better fix.
edit: seems like it didn't even fix it...
I also modified `native_amp.py` in both `lightning_fabric` and `pytorch_lightning`. It doesn't seem like the `lightning_fabric` one is called in a typical workflow, so I don't know if that's the right approach.
Hi @Borda, I also encounter the same issue. Will this be merged?
Let me check what is missing here...
Is this PR merged already? I'm still having this issue.
There were some failing tests, @milesial mind having a look?
⚠️ GitGuardian has uncovered 2 secrets following the scan of your pull request.
Please consider investigating the findings and remediating the incidents. Failure to do so may lead to compromising the associated services or software components.
🔎 Detected hardcoded secrets in your pull request
| GitGuardian id | GitGuardian status | Secret | Commit | Filename |
|---|---|---|---|---|
| - | Generic High Entropy Secret | 78fa3afdfbf964c19b4b2d36b91560698aa83178 | tests/tests_app/utilities/test_login.py | View secret |
| - | Base64 Basic Authentication | 78fa3afdfbf964c19b4b2d36b91560698aa83178 | tests/tests_app/utilities/test_login.py | View secret |
🛠 Guidelines to remediate hardcoded secrets
- Understand the implications of revoking this secret by investigating where it is used in your code.
- Replace and store your secret safely, following best practices.
- Revoke and rotate this secret.
- If possible, rewrite git history. Rewriting git history is not a trivial act. You might completely break other contributing developers' workflow and you risk accidentally deleting legitimate data.
To avoid such incidents in the future, consider:
- following best practices for managing and storing secrets, including API keys and other credentials
- installing secret detection on pre-commit to catch secrets before they leave your machine and ease remediation.
🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.
Do our GitHub checks need improvement? Share your feedback!
Codecov Report
Merging #16229 (965fc03) into master (6497e36) will decrease coverage by 54%. The report is 1 commit behind head on master. The diff coverage is 29%.
Additional details and impacted files
```
@@           Coverage Diff            @@
##           master   #16229    +/-  ##
==========================================
- Coverage      83%      29%     -54%
==========================================
  Files         450      442       -8
  Lines       38089    37941     -148
==========================================
- Hits        31803    11015   -20788
- Misses       6286    26926   +20640
```