tch-rs
Saving optimizer state
It doesn't seem like optimizers' internal states are stored in the VarStore, and I can't find any public interface for explicitly accessing or saving an optimizer's internal state. So, is it possible to save and reload training state without losing the optimizer's internal state?
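For context, here is a minimal sketch of the situation as I understand it, using the existing tch API (training loop omitted): the VarStore can be saved and reloaded, but nothing in it covers the optimizer state.

```rust
use tch::{nn, nn::OptimizerConfig, Device, TchError};

fn main() -> Result<(), TchError> {
    let vs = nn::VarStore::new(Device::Cpu);
    let _net = nn::linear(vs.root(), 4, 1, Default::default());
    let _opt = nn::Adam::default().build(&vs, 1e-3)?;

    // ... training steps here would build up Adam's per-parameter state
    // (first/second moment estimates) inside the underlying C++ optimizer ...

    // VarStore::save only serializes the named variables (the weights);
    // the optimizer state is not part of the VarStore, so reloading this
    // file and rebuilding the optimizer starts its state from scratch.
    vs.save("model.ot")?;
    Ok(())
}
```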
I've taken a look into this. The C++ API provides a torch::save function template that accepts a torch::optim::Optimizer as input, but it's not present in the torch-sys API, probably because the generator does not generate it. We'll first need to add it to torch-sys/libtch/torch_api.{cpp,h} and expose it in torch-sys/src/lib.rs in order for tch to use it. It's similar to how torch_sys::at_save is done.
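To make the plan concrete, here is a rough sketch of what the Rust-side declarations in torch-sys/src/lib.rs might look like, modeled on torch_sys::at_save. The ato_save / ato_load names and the C_optimizer opaque type are assumptions based on the naming used for the existing optimizer bindings, and may differ from what actually lands; the corresponding C++ glue in torch_api.{cpp,h} (forwarding to torch::save / torch::load on the optimizer) is not shown.

```rust
// Hypothetical additions to torch-sys/src/lib.rs, mirroring at_save.
// `C_optimizer` stands in for the opaque optimizer handle torch-sys uses;
// `ato_save` / `ato_load` would be new C functions added to
// torch_api.{cpp,h} that wrap torch::save / torch::load on the optimizer.
use std::os::raw::c_char;

#[repr(C)]
pub struct C_optimizer {
    _private: [u8; 0],
}

extern "C" {
    // Serialize the optimizer's internal state to `filename`.
    pub fn ato_save(optimizer: *mut C_optimizer, filename: *const c_char);
    // Restore the optimizer's internal state from `filename`.
    pub fn ato_load(optimizer: *mut C_optimizer, filename: *const c_char);
}
```

A higher-level wrapper in tch (e.g. save/load methods on the optimizer type) could then call these, much like the tensor save path goes through at_save.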