Improve LR scheduler documentation in transfer learning tutorial
Description
This PR improves the documentation and comments related to the learning rate scheduler (StepLR) in the transfer_learning_tutorial.py script.
Changes made:
- Added clarification about the recommended call order: optimizer.step() followed by scheduler.step().
- Explained why StepLR is stepped once per epoch in this tutorial.
- Improved clarity of the backward/optimizer comment inside the training loop.
- Updated comments to align with current PyTorch best practices.
No functional changes were made; this is a documentation-only improvement.
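For reviewers, a minimal sketch of the call order the updated comments describe: optimizer.step() inside the batch loop, scheduler.step() once per epoch. The model, data, and hyperparameters below are placeholders for illustration, not the tutorial's ResNet fine-tuning setup (the step_size=7, gamma=0.1 values do match the tutorial's StepLR configuration).

```python
import torch
import torch.nn as nn
import torch.optim as optim
from torch.optim import lr_scheduler

model = nn.Linear(10, 2)  # placeholder model, stands in for the tutorial's ResNet
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
# Decay the learning rate by a factor of 0.1 every 7 epochs, as in the tutorial.
scheduler = lr_scheduler.StepLR(optimizer, step_size=7, gamma=0.1)

for epoch in range(8):
    # One dummy batch per epoch; the tutorial iterates over a real dataloader.
    inputs = torch.randn(4, 10)
    labels = torch.randint(0, 2, (4,))

    optimizer.zero_grad()
    loss = criterion(model(inputs), labels)
    loss.backward()
    optimizer.step()    # update parameters first...

    scheduler.step()    # ...then step the scheduler, once per epoch

print(optimizer.param_groups[0]["lr"])  # decayed from 0.001 after epoch 7
```

Calling scheduler.step() before optimizer.step() triggers a UserWarning in current PyTorch and skips the first value of the schedule, which is why the comments now spell out this order.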
Checklist
- [x] The issue being addressed is documentation clarity (no existing issue number)
- [x] Only one change area is addressed in this PR
- [x] No unnecessary changes are included
- [x] PR follows PyTorch Tutorials contribution guidelines