
Optimizer is None when trying to finetune a pretrained model

Open · inigo-casanueva opened this issue 3 years ago • 1 comment

Thanks for the repo!

When trying to finetune one of the provided pretrained models, I got an unintuitive error. The cause is that the pretrained models were saved without optimizer state, so when the checkpoint is loaded (line 76 in training/trainer.py), the check does not stop the optimizer from being loaded: checkpoint['optimizer'] exists in the dict, but with a None value.

optimizer = Adam(model.parameters())
# 'optimizer' can be present in the dict with a None value, so this check alone is not enough
if 'optimizer' in checkpoint:
    optimizer.load_state_dict(checkpoint['optimizer'])
for g in optimizer.param_groups:
    g['lr'] = config['training']['learning_rate']
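
To illustrate the failure mode, here is a minimal standalone reproduction (the checkpoint dict is hypothetical): in Python, a key whose value is None still passes a plain membership test, so load_state_dict ends up being called with None.

checkpoint = {'optimizer': None}   # saved without optimizer state
print('optimizer' in checkpoint)   # True -> the guard passes
print(checkpoint['optimizer'])     # None -> load_state_dict(None) then fails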

Changing the line to if 'optimizer' in checkpoint and checkpoint['optimizer']: should fix it, since the truthiness test also rejects a None value.
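
For reference, a minimal sketch of the patched block with that change applied, assuming model, checkpoint, and config as in the snippet quoted above:

from torch.optim import Adam

optimizer = Adam(model.parameters())
# only load the saved state if the checkpoint actually carries one
if 'optimizer' in checkpoint and checkpoint['optimizer']:
    optimizer.load_state_dict(checkpoint['optimizer'])
for g in optimizer.param_groups:
    g['lr'] = config['training']['learning_rate']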

inigo-casanueva · Jan 13 '23 11:01

Hi, thanks for the hint. I will update this if I have time :)

cschaefer26 · Feb 23 '23 12:02