
Change train hyperparameters

WongVi opened this issue 3 years ago · 1 comment

I want to save the trained weights together with the hyperparameters I tuned during testing, so that I can use the best test-time hyperparameters in production without passing any parser arguments. Is it possible to save the weights with the new test parameters without further training?

WongVi avatar Aug 22 '22 09:08 WongVi

If you're referring to PARSeq's runtime parameters decode_ar and refine_iters, then yes, you can modify them without retraining the model.

The hyperparameters are stored in the hparams attribute (e.g. model.hparams). You can use it like a dict and override the parameter of choice, e.g. model.hparams['decode_ar'] = False. Then use torch.save(model, 'model.pt') to save both the weights and the hyperparameters.
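A minimal sketch of the steps above. The `_Demo` class is a hypothetical stand-in for a loaded PARSeq model (it just exposes an `hparams` dict the way a PyTorch Lightning module does); in practice you would load the real pretrained model instead, as shown in the commented line.

```python
import io
import torch

# In practice, load the real model, e.g.:
# model = torch.hub.load('baudm/parseq', 'parseq', pretrained=True)

# Hypothetical stand-in with an hparams dict, mimicking a Lightning module:
class _Demo(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.hparams = {'decode_ar': True, 'refine_iters': 1}

model = _Demo()

# Override the runtime parameters in place -- no retraining needed:
model.hparams['decode_ar'] = False
model.hparams['refine_iters'] = 2

# Persist the whole module (weights + hparams) so no parser args are
# needed at load time; a BytesIO buffer stands in for 'model.pt' here:
buffer = io.BytesIO()
torch.save(model, buffer)
```

Because the entire module is pickled, the overridden hparams travel with the checkpoint and are restored as-is by `torch.load`.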

baudm avatar Aug 22 '22 09:08 baudm