Multi-hopComplexKBQA
size mismatch for ranker.embeddings.token_type_embeddings.weight
Traceback (stack trace):
Traceback (most recent call last):
  File "code/KBQA_Runner.py", line 819, in <module>
    main()
  File "code/KBQA_Runner.py", line 581, in main
    policy.load_state_dict(model_dic, strict=False)
  File "/home/lujincheng/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 839, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for Policy:
    size mismatch for ranker.embeddings.token_type_embeddings.weight: copying a param with shape torch.Size([5, 768]) from checkpoint, the shape in current model is torch.Size([3, 768]).
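A minimal sketch of what is happening here, using a standalone nn.Embedding in place of the repo's actual Policy/ranker modules (the names below are illustrative): PyTorch's strict=False only suppresses missing/unexpected keys, it does NOT skip parameters whose shapes differ, which is why the call still raises.

```python
import torch
import torch.nn as nn

# Current model built with type_vocab_size = 3 -> weight shape [3, 768].
model = nn.Embedding(3, 768)
# Checkpoint saved from a model with 5 token types -> weight shape [5, 768].
checkpoint = {"weight": torch.zeros(5, 768)}

mismatch_raised = False
try:
    # strict=False does not bypass shape mismatches; this still raises.
    model.load_state_dict(checkpoint, strict=False)
except RuntimeError:
    mismatch_raised = True

# One workaround (besides fixing the config): drop shape-mismatched entries
# before loading, keeping only parameters whose shapes agree with the model.
own_state = model.state_dict()
filtered = {k: v for k, v in checkpoint.items()
            if k in own_state and v.shape == own_state[k].shape}
model.load_state_dict(filtered, strict=False)  # loads nothing here, but no error
```

Filtering sidesteps the crash but leaves the mismatched table randomly initialized, so rebuilding the model with the checkpoint's type_vocab_size is the cleaner fix.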
Environment:
Python 3.6, torch==1.3.0
Model/data used: CQ, all downloaded from your Google Drive links.
Oh, I get it: change "type_vocab_size" from 3 to 5 in /config/bert_config.json.
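For reference, a minimal sketch of the edited field in /config/bert_config.json (only this key changes; the checkpoint's token_type_embeddings table has 5 rows, so the model must be built with 5 token types to match):

```json
{
  "type_vocab_size": 5
}
```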