MWPToolkit
Problems when testing a trained model
When I test with a trained model, the dimensions of the input data differ from the dimensions the model was trained with. Why is this happening?
Error: RuntimeError: Error(s) in loading state_dict for GTS: size mismatch for embedder.embedder.weight: copying a param with shape torch.Size([3349, 128]) from checkpoint, the shape in current model is torch.Size([3322, 128]).
Hi, I had the same error. In my case it was because I tried to load and test a model that had previously been trained on a different dataset (actually a different trainset). This creates the mismatch in the embedding vocabulary sizes (as in your error message), since the embeddings are sized at model initialization from the dataset's vocabulary. See this line.
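The mismatch can be reproduced outside the toolkit with a minimal sketch: an embedding layer sized from one vocabulary cannot load a checkpoint whose embedding was sized from another. The sizes below are taken from the error message above for illustration; the model here is a bare `nn.Embedding`, not GTS itself.

```python
import torch.nn as nn

# Embedding sized by the vocabulary of the dataset the checkpoint was trained on.
trained = nn.Embedding(num_embeddings=3349, embedding_dim=128)

# Embedding rebuilt at initialization from a *different* dataset's vocabulary.
current = nn.Embedding(num_embeddings=3322, embedding_dim=128)

try:
    # Loading the old checkpoint into the newly initialized model fails,
    # because the weight shapes [3349, 128] vs [3322, 128] do not match.
    current.load_state_dict(trained.state_dict())
except RuntimeError as e:
    print("size mismatch" in str(e))  # the same error class as in the question
```

The fix is to test with the same dataset (and therefore the same vocabulary) that the checkpoint was trained on, or to retrain the model on the new dataset.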