Is it possible to resume training from any epoch?
Hey! Thanks for your work! I have a question: can I resume training a model from the epoch at which it was last saved?
I mean, if I start training from scratch and the training gets interrupted, can I continue it later (perhaps using an existing .pkl file)?
I'm trying to plug in mobilenet_v2 as a backbone, but it needs distillation and experimentation to improve. Since the training process is long, I'd like to train up to epoch 10 (for example), run other experiments, and then continue training from epoch 10 with the same model and the same source code as before.
@ArtiX-GP Sorry for the late reply. I'm not sure I fully understand the question.
To fine-tune an existing model, please set the following options, for example in v1.yml:
https://github.com/Arthur151/ROMP/blob/7b4734270672e602f4fc5659d37479478e72b78b/configs/v1.yml#L18
```yaml
fine_tune: True
model_path: /path/to/model.pkl
```
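
For context, resuming usually just means saving the model (and optionally the optimizer) state and loading it before training continues. Below is a minimal, generic PyTorch sketch of that pattern, not ROMP's own training loop; the file name, the toy model, and the optimizer are placeholders for illustration only.

```python
# Minimal sketch of checkpoint-based resuming in plain PyTorch.
# NOT ROMP's code: checkpoint_path, the toy model, and the optimizer
# are hypothetical stand-ins chosen only to illustrate the pattern.
import os
import torch
import torch.nn as nn

checkpoint_path = "checkpoint.pkl"  # hypothetical path
model = nn.Linear(10, 2)            # stand-in for the real backbone
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
start_epoch = 0

# If a previous checkpoint exists, load it and pick up where we left off
# (conceptually what fine_tune: True + model_path does for the weights).
if os.path.exists(checkpoint_path):
    state = torch.load(checkpoint_path, map_location="cpu")
    model.load_state_dict(state["model"])
    optimizer.load_state_dict(state["optimizer"])
    start_epoch = state["epoch"] + 1

for epoch in range(start_epoch, 20):
    # ... run one epoch of training here ...

    # Save everything needed to resume later in a single file.
    torch.save(
        {"model": model.state_dict(),
         "optimizer": optimizer.state_dict(),
         "epoch": epoch},
        checkpoint_path,
    )
```

Note that `fine_tune` with `model_path` restores the model weights; if you also need the exact optimizer state and epoch counter, you would have to save and restore those yourself as in the sketch above.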