self-critical.pytorch

Error when using pretrained GloVe embeddings

Open · masonwang96 opened this issue 4 years ago · 1 comment

Hi Ruotian Luo, thank you for your excellent code. I have a problem, though. I want to use pretrained GloVe vectors as the initial embeddings, so I set input_encoding_size=300 (the GloVe vector dimension) and set embed[0].weight.requires_grad = False for the first 10 epochs. Once the first 10 epochs are done, I set embed[0].weight.requires_grad = True. Here is the problem: when loading the optimizer from optimizer.pth, it throws an error like this: ValueError: loaded state dict contains a parameter group that doesn't match the size of optimizer's group.

I guess it's because I set embed[0].weight.requires_grad = True, but I don't know how to fix the problem. Could you help me? Thanks.
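
For reference, here is a minimal self-contained sketch (not the repo's actual training code; `vocab_size`, the `Linear` stand-in, and the learning rate are all illustrative) of why the load fails: the checkpoint was saved while the embedding was frozen, so its single param group holds one parameter fewer than the group of the optimizer rebuilt after unfreezing.

```python
import torch
import torch.nn as nn

vocab_size, glove_dim = 10000, 300               # illustrative sizes

embedding = nn.Embedding(vocab_size, glove_dim)  # plays the role of embed[0]
rest = nn.Linear(glove_dim, 512)                 # stand-in for the rest of the model
embedding.weight.data.copy_(torch.randn(vocab_size, glove_dim))  # pretend GloVe
embedding.weight.requires_grad = False           # frozen for the first 10 epochs

# The optimizer is built over trainable parameters only, so the frozen
# embedding weight is absent from its single param group (2 params here).
trainable = [p for p in list(embedding.parameters()) + list(rest.parameters())
             if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=5e-4)
torch.save(optimizer.state_dict(), 'optimizer.pth')

# After epoch 10 the embedding is unfrozen; a rebuilt optimizer now has
# 3 params in its group, one more than the checkpointed group...
embedding.weight.requires_grad = True
trainable = [p for p in list(embedding.parameters()) + list(rest.parameters())
             if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=5e-4)

# ...so this raises: ValueError: loaded state dict contains a parameter
# group that doesn't match the size of optimizer's group
optimizer.load_state_dict(torch.load('optimizer.pth'))
```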

masonwang96 · Mar 07 '20 11:03

For now, you can remove the optimizer-loading part.
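
If you still want normal resumes to work, one way to "remove the loading part" only when it actually conflicts is to guard the load and fall back to the freshly built optimizer on a mismatch. This is a sketch, not the repo's code; `maybe_load_optimizer` is a hypothetical helper:

```python
import os
import torch

def maybe_load_optimizer(optimizer, start_from):
    """Load optimizer.pth if it matches the current param groups, else skip.

    Hypothetical helper, not part of the repo; `start_from` mirrors the
    checkpoint directory used elsewhere in the training script.
    """
    path = os.path.join(start_from, 'optimizer.pth')
    if not os.path.isfile(path):
        return
    try:
        optimizer.load_state_dict(torch.load(path))
    except ValueError:
        # Param groups changed (e.g. the embedding was unfrozen after
        # epoch 10), so keep the freshly constructed optimizer state.
        print('optimizer.pth does not match current param groups; skipping load.')
```

Only Adam's running moment estimates are lost on the fallback; the model weights still come from the model checkpoint. As far as I know, recent PyTorch versions also accept frozen parameters in an optimizer (they are simply skipped during the step), so building the optimizer over all parameters from the start would keep the group size constant across the freeze/unfreeze switch.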

ruotianluo · Mar 07 '20 13:03