self-critical.pytorch
Error when using pretrained GloVe embeddings
Hi Ruotian Luo, thank you for your excellent code.
But I have a problem. I want to use pretrained GloVe vectors as the initial embeddings, so I set input_encoding_size=300 (the GloVe vector dimension) and set embed[0].weight.requires_grad = False for the first 10 epochs. Once the first 10 epochs are done, I set embed[0].weight.requires_grad = True. Here is the problem: when loading the optimizer from optimizer.pth, it throws an error like this: ValueError: loaded state dict contains a parameter group that doesn't match the size of optimizer's group.
I guess it's because I set embed[0].weight.requires_grad = True, but I don't know how to fix the problem. Could you help me? Thanks.
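For context, here is a minimal, self-contained sketch that reproduces the error, assuming the optimizer is built over only the trainable parameters (a common pattern; I don't know if that matches the repo's actual training script). The toy model, sizes, and learning rate are placeholders; only the 300-dim embedding and the optimizer.pth filename come from the setup above:

```python
import torch
import torch.nn as nn

# Toy stand-in for the captioning model: an embedding plus one other layer.
# (Sizes and lr are illustrative assumptions, not the repo's actual values.)
model = nn.Sequential(nn.Embedding(10000, 300), nn.Linear(300, 512))

# First 10 epochs: the pretrained GloVe embedding is frozen.
model[0].weight.requires_grad = False

# Only trainable parameters are handed to the optimizer, so the saved
# optimizer.pth contains no entry for the embedding weight.
optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=5e-4)
torch.save(optimizer.state_dict(), 'optimizer.pth')

# After epoch 10: unfreeze the embedding and rebuild the optimizer. Its
# single parameter group is now one parameter larger than the saved one.
model[0].weight.requires_grad = True
optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=5e-4)

# This raises: ValueError: loaded state dict contains a parameter group
# that doesn't match the size of optimizer's group
optimizer.load_state_dict(torch.load('optimizer.pth'))
```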
For now, you can remove the optimizer-loading part.
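A minimal sketch of that workaround, assuming the training script restores the optimizer via load_state_dict when resuming (the guard flag is hypothetical): skip the restore once the parameter groups have changed, and let the rebuilt optimizer start fresh.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Embedding(10000, 300), nn.Linear(300, 512))
optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=5e-4)

# Guard the optimizer restore; the model weights and other checkpoint
# state can still be resumed as before. (Flag name is hypothetical.)
resume_optimizer = False  # set False after unfreezing the embedding
if resume_optimizer:
    optimizer.load_state_dict(torch.load('optimizer.pth'))
```

This discards the saved Adam moments. An alternative that preserves them is to load the checkpoint into an optimizer built with the original (frozen) parameter set, then register the newly unfrozen embedding weight as a separate group via optimizer.add_param_group.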