accelerate
How to save the optimizer state while using DeepSpeed to save the model
System Info
Unrelated to configuration
Information
- [X] The official example scripts
- [ ] My own modified scripts
Tasks
- [ ] One of the scripts in the examples/ folder of Accelerate or an officially supported `no_trainer` script in the `examples` folder of the `transformers` repo (such as `run_no_trainer_glue.py`)
- [X] My own task or dataset (give details below)
Reproduction
```python
unwrapped_model = accelerator.unwrap_model(transformer)
unwrapped_model.save_pretrained(
    save_directory,
    save_function=accelerator.save,
    state_dict=accelerator.get_state_dict(transformer),
)
```
I am using DeepSpeed ZeRO-2.
I want to save both the model state and the optimizer state, but `save_pretrained()` only saves the model weights. How can I also save the optimizer state?
Expected behavior
I would like to know whether Accelerate supports saving the optimizer state when DeepSpeed is enabled, and how to use it.
Thanks!