[BUG]: chatgpt inference still ERROR after fix
🐛 Describe the bug
Still getting the error after the fix:

size mismatch for transformer.ln_f.weight: copying a param with shape torch.Size([768]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for transformer.ln_f.bias: copying a param with shape torch.Size([768]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for lm_head.weight: copying a param with shape torch.Size([50257, 768]) from checkpoint, the shape in current model is torch.Size([250880, 64]).
params:

parser.add_argument('--model', default='bloom', choices=['gpt2', 'bloom', 'opt'])
parser.add_argument('--pretrain', type=str, default='./actor_checkpoint_prompts.pt')
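The shapes in the error message point at the likely cause: the checkpoint holds GPT-2-sized tensors (hidden size 768, vocabulary 50257), while the model built for `--model bloom` is a small BLOOM configuration (hidden size 64, vocabulary 250880). A minimal sketch, assuming `./actor_checkpoint_prompts.pt` is a plain PyTorch state dict, to confirm which architecture the checkpoint was saved from:

```python
# Sketch (not part of the original report): print the shapes of the keys
# named in the error, assuming the .pt file is a plain state dict.
import torch

state_dict = torch.load("./actor_checkpoint_prompts.pt", map_location="cpu")

for key in ("transformer.ln_f.weight", "transformer.ln_f.bias", "lm_head.weight"):
    if key in state_dict:
        print(key, tuple(state_dict[key].shape))

# A hidden size of 768 and a 50257-token vocabulary match GPT-2;
# BLOOM uses a 250880-token vocabulary, so loading this checkpoint
# into a BLOOM model is expected to fail with a size mismatch.
```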
Environment
No response
File "inference.py", line 56, in
Thanks for your feedback. Can you share the command line you ran, please?
python inference.py --pretrain ./actor_checkpoint_prompts.pt --model bloom
If I use gpt2 for training, I get the right results; bloom is not right either way.
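Since the checkpoint shapes match GPT-2, one possible follow-up (an assumption based on the shapes above, not a confirmed fix from this thread) is to run inference with the same --model value that was used for training, e.g.:

python inference.py --pretrain ./actor_checkpoint_prompts.pt --model gpt2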