albert_pytorch
RuntimeError: Error(s) in loading state_dict for BertModel
RuntimeError: Error(s) in loading state_dict for BertModel: size mismatch for bert.embeddings.word_embeddings.weight: copying a param with shape torch.Size([21128, 128]) from checkpoint, the shape in current model is torch.Size([21128, 312]).
Can anyone help with this issue? I used the convert script, but there seems to be a problem with the parameter sizes in the checkpoint. Thank you.
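The mismatch above (checkpoint shape `[21128, 128]` vs. expected `[21128, 312]`) is consistent with ALBERT's factorized embedding: albert_tiny_zh stores word embeddings at `embedding_size=128` while the hidden size is 312, so loading it into a plain `BertModel` that expects `[vocab, hidden_size]` fails. One way to diagnose this before loading is to diff the checkpoint's parameter shapes against the model's. Below is a minimal, hypothetical sketch of such a check (the shape values are taken from the error message; the function name is illustrative, not part of the repo):

```python
# Hypothetical shapes, taken from the error message above:
# the checkpoint uses a factorized embedding (vocab x 128),
# the instantiated model expects vocab x hidden_size (312).
checkpoint_shapes = {
    "bert.embeddings.word_embeddings.weight": (21128, 128),
}
model_shapes = {
    "bert.embeddings.word_embeddings.weight": (21128, 312),
}

def find_shape_mismatches(ckpt, model):
    """Return {param_name: (checkpoint_shape, model_shape)} for every
    parameter present in both dicts whose shapes disagree."""
    return {
        name: (ckpt[name], model[name])
        for name in ckpt
        if name in model and ckpt[name] != model[name]
    }

mismatches = find_shape_mismatches(checkpoint_shapes, model_shapes)
for name, (ckpt_shape, model_shape) in mismatches.items():
    print(f"{name}: checkpoint {ckpt_shape} vs model {model_shape}")
```

In practice you would build `checkpoint_shapes` from `torch.load(path, map_location="cpu")` and `model_shapes` from `model.state_dict()`; if the only mismatches are embedding-related, the usual fix is to make sure the config passed to the model has the same `embedding_size` (128 here) as the checkpoint, rather than defaulting it to `hidden_size`.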
I ran into the same problem when applying albert_tiny_zh in PyTorch. Have you solved it? Looking forward to your reply. ^_^