
shape mismatch when loading llava-phi path

Open shockjiang opened this issue 9 months ago • 2 comments

I am trying to load the pretrained LLaVA checkpoint hub/llava-phi-3-mini-pth/model.pth and get this strange error:

  • Setup: DeepSpeed ZeRO-3 and flash-attn.
RuntimeError: Error(s) in loading state_dict for LLaVAModel:
	size mismatch for llm.model.embed_tokens.weight: copying a param with shape torch.Size([32064, 3072]) from checkpoint, the shape in current model is torch.Size([0]).
	size mismatch for llm.model.layers.0.self_attn.o_proj.weight: copying a param with shape torch.Size([3072, 3072]) from checkpoint, the shape in current model is torch.Size([0]).

any clue? thx!
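(Editor's note, not part of the original report: the `torch.Size([0])` side of the mismatch can be reproduced in plain PyTorch by loading a checkpoint into a zero-element placeholder parameter, which is what a ZeRO-3-partitioned module exposes before its parameters are gathered. A minimal sketch:)

```python
import torch
import torch.nn as nn

# Sketch: reproduce "the shape in current model is torch.Size([0])".
# Under DeepSpeed ZeRO-3, each rank holds 0-element placeholder parameters
# until they are gathered, so a naive load_state_dict fails this way.
src = nn.Linear(8, 8, bias=False)           # stands in for the checkpoint
dst = nn.Linear(8, 8, bias=False)           # stands in for the wrapped model
dst.weight = nn.Parameter(torch.empty(0))   # mimic a partitioned placeholder

try:
    dst.load_state_dict(src.state_dict())
except RuntimeError as e:
    print("size mismatch" in str(e))  # True
```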

shockjiang avatar May 13 '24 01:05 shockjiang

The issue might be that the local model was not initialized correctly.

Before loading the checkpoint, check whether the model's state_dict contains the key llm.model.layers.0.self_attn.o_proj.weight, and what shape it has.
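A minimal way to run that check (the model below is a toy stand-in; in practice, build LLaVAModel from your xtuner config first and query the real key):

```python
import torch.nn as nn

# Toy stand-in for the real LLaVAModel built from the xtuner config.
model = nn.Sequential(nn.Linear(4, 4, bias=False))

def check_param(model, key):
    """Return (present, shape) for a parameter key in the model's state_dict."""
    sd = model.state_dict()
    if key not in sd:
        return False, None
    return True, tuple(sd[key].shape)

# With the real model, use "llm.model.layers.0.self_attn.o_proj.weight";
# a reported shape of () or 0 elements points at ZeRO-3 partitioning.
print(check_param(model, "0.weight"))  # (True, (4, 4))
```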

pppppM avatar May 14 '24 06:05 pppppM

@shockjiang Can you try this in https://github.com/InternLM/xtuner/blob/main/xtuner/configs/deepspeed/deepspeed_zero3.json ?

{
  "zero_optimization": {
    "stage3_prefetch_bucket_size": 0
  }
}

hhaAndroid avatar May 16 '24 03:05 hhaAndroid