parseq
Change train hyperparameters
I want to save the trained weights after changing hyperparameters during testing, so that I can use the best test-time hyperparameters in production without passing any parser arguments. Is it possible to save the weights with the new test parameters without further training?
If you're referring to PARSeq's runtime parameters decode_ar and refine_iters, then yes, you can modify them without retraining the model.
The hyperparameters are stored in the hparams attribute (e.g. model.hparams). It behaves like a dict, so you can override the parameter of choice, e.g. model.hparams['decode_ar'] = False. Then use torch.save(model, 'model.pt') to save the weights together with the updated hyperparameters.
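A minimal sketch of that workflow. The TinyModel class here is a hypothetical stand-in for a trained PARSeq model (which exposes a dict-like hparams attribute via PyTorch Lightning); the override-and-save steps are the same:

```python
import torch
import torch.nn as nn

class TinyModel(nn.Module):
    """Hypothetical stand-in for a trained PARSeq model."""
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 2)
        # Mimic Lightning's dict-like hparams attribute.
        self.hparams = {'decode_ar': True, 'refine_iters': 1}

model = TinyModel()

# Override the runtime decoding parameters; no retraining needed,
# since they only affect inference behavior, not the weights.
model.hparams['decode_ar'] = False
model.hparams['refine_iters'] = 2

# Persist the whole module: weights and updated hparams together.
torch.save(model, 'model.pt')

# Reloading restores both (weights_only=False because we pickled
# the full module, not just a state_dict).
reloaded = torch.load('model.pt', weights_only=False)
print(reloaded.hparams['decode_ar'])
```

In production you then load the saved model and get the tuned test-time parameters automatically, with no command-line arguments required.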