
Checkpointing primitives for Lite

Open awaelchli opened this issue on Sep 20, 2022 • 4 comments

🚀 Feature

Add Lite.save_checkpoint and Lite.load_checkpoint convenience methods.

Motivation

It is cumbersome to manually construct a checkpoint dict with all metadata and states.

Pitch

Saving checkpoints:

ckpt = self.create_checkpoint(model1, model2, ..., optimizer1, optimizer2, ..., key1=value1, key2=value2)
self.save(ckpt, "path/to/ckpt.pt")
  1. Creates a dict and collects the state dicts of all objects passed in (determined via instance checks)
  2. Depending on the strategy, consolidates optimizer state, etc.
  3. User-defined metadata can be passed in
  4. Adds version information
  5. Checkpoint creation and saving are separated to give the user control to modify the contents if needed (a rough sketch follows below)
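A rough sketch of what create_checkpoint could do internally, written here as a free function for illustration. The dict layout, the version key, and the usage at the end are assumptions about one possible implementation, not existing Lite API:

```python
import pytorch_lightning as pl
import torch
from torch import nn
from torch.optim import Optimizer


def create_checkpoint(*objects, **metadata):
    """Hypothetical sketch: collect state dicts from models/optimizers plus user metadata."""
    checkpoint = {
        "models": [],
        "optimizers": [],
        "pytorch-lightning_version": pl.__version__,  # illustrative version field
    }
    for obj in objects:
        if isinstance(obj, nn.Module):
            checkpoint["models"].append(obj.state_dict())
        elif isinstance(obj, Optimizer):
            # A strategy-aware implementation would consolidate sharded optimizer state here.
            checkpoint["optimizers"].append(obj.state_dict())
    checkpoint.update(metadata)  # user-defined keys, e.g. epoch=3
    return checkpoint


# Illustrative usage, mirroring the pitch:
model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
ckpt = create_checkpoint(model, optimizer, epoch=3)
torch.save(ckpt, "ckpt.pt")  # in the pitch, self.save would do this in a strategy-aware way
```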

Loading checkpoints:

ckpt = self.load("path/to/ckpt.pt")
self.apply_checkpoint(ckpt, model1, model2, ..., optimizer1, optimizer2)

# if you need to access your metadata:
val1 = ckpt["key1"]
  1. The user loads the file
  2. The user has the model and optimizers instantiated
  3. The user applies the checkpoint to the objects: the state-dict contents are loaded into the model, optimizers, etc.
  4. Loading the checkpoint from the file and applying it to the models are separated to give the user control to modify the contents before they are loaded (see the sketch below)
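Correspondingly, a sketch of what apply_checkpoint could do, assuming the dict layout from the saving sketch above (again an assumption, not existing Lite API):

```python
from torch import nn
from torch.optim import Optimizer


def apply_checkpoint(checkpoint, *objects):
    """Hypothetical sketch: load the stored state dicts back into the live objects,
    matching them up by type and order."""
    models = iter(checkpoint["models"])
    optimizers = iter(checkpoint["optimizers"])
    for obj in objects:
        if isinstance(obj, nn.Module):
            obj.load_state_dict(next(models))
        elif isinstance(obj, Optimizer):
            obj.load_state_dict(next(optimizers))
```

Metadata keys would stay untouched in the dict, so ckpt["key1"] remains accessible after applying, as in the pitch above.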

Open questions

  • I'm not sure whether the arguments should be keyword-only.

Alternatives

The current way: constructing the dicts manually and saving/loading them with the self.save / self.load helpers.
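For comparison, a minimal sketch of the current manual approach, assuming the existing self.save(content, path) / self.load(path) helpers from the pitch above (model, optimizer, and the metadata key are illustrative):

```python
import torch
from torch import nn
from pytorch_lightning.lite import LightningLite  # Lite import path around the time of this issue


class ManualCheckpointLite(LightningLite):
    def run(self):
        model = nn.Linear(4, 2)
        optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

        # Construct the checkpoint dict entirely by hand ...
        checkpoint = {
            "model": model.state_dict(),
            "optimizer": optimizer.state_dict(),
            "epoch": 3,  # metadata tracked manually
        }
        self.save(checkpoint, "ckpt.pt")

        # ... and restore it by hand as well.
        checkpoint = self.load("ckpt.pt")
        model.load_state_dict(checkpoint["model"])
        optimizer.load_state_dict(checkpoint["optimizer"])


ManualCheckpointLite().run()
```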


If you enjoy Lightning, check out our other projects! ⚡

  • Metrics: Machine learning metrics for distributed, scalable PyTorch applications.

  • Lite: enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.

  • Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

  • Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.

  • Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.

awaelchli · Sep 20 '22 23:09

You might find this library useful for such primitives, especially to support distributed checkpointing: https://github.com/pytorch/torchsnapshot
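For reference, a rough sketch of torchsnapshot's interface as described in its README (exact signatures may differ between releases; the path is illustrative):

```python
import torch
from torch import nn
from torchsnapshot import Snapshot

model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# torchsnapshot collects and persists the state in one step (and is distributed-aware):
app_state = {"model": model, "optimizer": optimizer}
snapshot = Snapshot.take(path="my_snapshot", app_state=app_state)

# Restoring loads the state back into the live objects in place:
snapshot.restore(app_state=app_state)
```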

@yifuwang

ananthsub · Sep 20 '22 23:09

Hey @ananthsub @yifuwang, would you be interested in making a contribution to Lite?

tchaton · Sep 21 '22 07:09

@ananthsub Thanks, yes, we had already seen it and the interface is really nice. It could be useful here too, called under the hood.

In Lite, we also have CheckpointIO (attached to the strategies), which takes care of saving and loading, but state-dict collection from the objects happens separately. Since torchsnapshot does both, it would have to be integrated differently there.
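For context, CheckpointIO only covers the I/O half. A simplified sketch of its shape (signatures simplified from the actual plugin interface; import path may vary by version):

```python
import os
from typing import Any, Dict

import torch
from pytorch_lightning.plugins.io import CheckpointIO  # import path may differ by version


class SimpleTorchCheckpointIO(CheckpointIO):
    """Sketch with simplified signatures: CheckpointIO only writes/reads an
    already-assembled checkpoint dict; it does not collect the state dicts itself."""

    def save_checkpoint(self, checkpoint: Dict[str, Any], path: str, storage_options: Any = None) -> None:
        torch.save(checkpoint, path)

    def load_checkpoint(self, path: str, *args: Any, **kwargs: Any) -> Dict[str, Any]:
        return torch.load(path)

    def remove_checkpoint(self, path: str) -> None:
        os.remove(path)
```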

awaelchli · Sep 21 '22 10:09

Please, let's keep the torchsnapshot integration discussion focused on #14503. It's on our roadmap, just waiting for the Lite changes to be finished.

carmocca · Sep 21 '22 11:09