pytorch-lightning
Current FSDPPrecision does not support custom scaler for 16-mixed precision
Bug description
self.precision here is inherited from the parent class Precision, so at the point where the scaler is constructed it still evaluates to "32-true".
The subsequent assignment of self.scaler also resolves to None whenever a custom scaler is passed in (i.e. scaler is not None) and precision == "16-mixed", so the user-supplied scaler is silently discarded.
Is this intentional or a bug?
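A minimal, self-contained sketch of the pattern described above (simplified stand-ins, not the actual Lightning source): the class attribute on the base class shadows the intended precision during __init__, and the conditional assignment drops any user-supplied scaler.

```python
class Precision:
    # Class attribute: during a subclass's __init__, self.precision resolves
    # to this value until the subclass overwrites it on the instance.
    precision = "32-true"


class FSDPPrecisionSketch(Precision):
    # Hypothetical simplification of the FSDPPrecision constructor logic.
    def __init__(self, precision, scaler=None):
        # self.precision is still the inherited "32-true" at this point,
        # so the first branch is never taken for a plain call...
        self.scaler = (
            object()  # stands in for ShardedGradScaler()
            if scaler is None and self.precision == "16-mixed"
            else None  # ...and a custom scaler is silently replaced by None
        )
        self.precision = precision  # overwritten only after the scaler is set


default_plugin = FSDPPrecisionSketch("16-mixed")
custom_plugin = FSDPPrecisionSketch("16-mixed", scaler=object())
print(default_plugin.scaler, custom_plugin.scaler)  # None None
```

Both calls end up with scaler == None: the default call because self.precision has not been set yet, and the custom call because the else branch discards the passed-in scaler.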
What version are you seeing the problem on?
v2.2
How to reproduce the bug
No response
Error messages and logs
# Error messages and logs here please
Environment
Current environment
#- Lightning Component (e.g. Trainer, LightningModule, LightningApp, LightningWork, LightningFlow):
#- PyTorch Lightning Version (e.g., 1.5.0):
#- Lightning App Version (e.g., 0.5.2):
#- PyTorch Version (e.g., 2.0):
#- Python version (e.g., 3.9):
#- OS (e.g., Linux):
#- CUDA/cuDNN version:
#- GPU models and configuration:
#- How you installed Lightning(`conda`, `pip`, source):
#- Running environment of LightningApp (e.g. local, cloud):
More info
No response