Is it possible to checkpoint a scalar value?
From the Accelerate documentation:
"By using register_for_checkpointing(), you can register custom objects to be automatically stored or loaded from the two prior functions, so long as the object has a state_dict and a load_state_dict functionality. This could include objects such as a learning rate scheduler."
Is it possible to include scalar values such as the epoch and step in a checkpoint, just like in PyTorch? If so, the redundant lines needed to recover the epoch could be avoided. A minimal sketch of one way to do this follows below.
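Since `register_for_checkpointing()` only requires an object with `state_dict()` and `load_state_dict()` methods, one option is to wrap the scalars in a small container class that implements both. Here is a minimal sketch; the `TrainingState` class, its fields, and the `checkpoint_dir` path are hypothetical names chosen for illustration, not part of the Accelerate API:

```python
from accelerate import Accelerator


class TrainingState:
    """Hypothetical wrapper that makes scalars checkpointable.

    Any object exposing state_dict()/load_state_dict() can be passed to
    Accelerator.register_for_checkpointing(), so wrapping scalars such as
    the current epoch and step includes them in save_state()/load_state().
    """

    def __init__(self, epoch: int = 0, step: int = 0):
        self.epoch = epoch
        self.step = step

    def state_dict(self):
        return {"epoch": self.epoch, "step": self.step}

    def load_state_dict(self, state_dict):
        self.epoch = state_dict["epoch"]
        self.step = state_dict["step"]


accelerator = Accelerator()
training_state = TrainingState()

# Register the wrapper so its scalars are saved/restored automatically.
accelerator.register_for_checkpointing(training_state)

# ... the training loop updates training_state.epoch / training_state.step ...

# Save everything (model, optimizer, registered objects) to a directory.
accelerator.save_state("checkpoint_dir")

# Later: restore, including the scalars, with no manual bookkeeping.
accelerator.load_state("checkpoint_dir")
print(f"Resuming from epoch {training_state.epoch}, step {training_state.step}")
```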
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Sure, we absolutely can. If you'd like to expand our checkpointing example here in Accelerate to implement that, we can look at upstreaming it further 🤗