Beware of DeepSpeed LR schedulers
PyTorch Lightning and DeepSpeed LR schedulers don't currently interact correctly. Follow the corresponding PyTorch Lightning issue for updates. In the meantime, use configure_optimizers in train_openfold to add LR scheduling logic.
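A minimal sketch of what this can look like, using the standard PyTorch Lightning configure_optimizers hook. The class name OpenFoldWrapper and the warmup schedule below are placeholders for illustration, not OpenFold's actual LR logic:

```python
import torch
import pytorch_lightning as pl


class OpenFoldWrapper(pl.LightningModule):
    # ... model definition elided ...

    def configure_optimizers(self):
        # Define the optimizer here instead of in the DeepSpeed config
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3, eps=1e-8)

        # Placeholder linear warmup; substitute whatever schedule you need
        def warmup(step, warmup_steps=1000):
            return min(1.0, (step + 1) / warmup_steps)

        scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=warmup)

        return {
            "optimizer": optimizer,
            "lr_scheduler": {
                "scheduler": scheduler,
                "interval": "step",  # step the schedule every training step
            },
        }
```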
Just commenting for others who may come across this: for this to work, you need to remove (or omit) the optimizer and scheduler sections from the DeepSpeed config and redefine them in configure_optimizers.
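For illustration, a sketch of a trimmed-down config passed to Lightning's DeepSpeed strategy. The specific config values and device counts are made up; the point is only that the "optimizer" and "scheduler" keys are absent:

```python
import pytorch_lightning as pl
from pytorch_lightning.strategies import DeepSpeedStrategy

# DeepSpeed config WITHOUT "optimizer" and "scheduler" sections; both are
# defined in configure_optimizers instead (see the sketch above). The other
# keys here are illustrative, not OpenFold's actual settings.
ds_config = {
    "zero_optimization": {"stage": 2},
    "train_micro_batch_size_per_gpu": 1,
}

trainer = pl.Trainer(
    accelerator="gpu",
    devices=4,
    strategy=DeepSpeedStrategy(config=ds_config),
)
```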