
Error when fine tuning: Size Mismatch

Open · Lev-Stambler opened this issue 1 year ago · 0 comments

Hello, I am trying to fine-tune, starting from Alpaca 7B as the base model, and I am getting a "size mismatch" error.

Would anyone know where this comes from?

Here is the error:

RuntimeError: Error(s) in loading state_dict for PeftModelForCausalLM:
	size mismatch for base_model.model.model.layers.0.self_attn.q_proj.lora_A.weight: copying a param with shape torch.Size([16, 4096]) from checkpoint, the shape in current model is torch.Size([8, 4096]).
	size mismatch for base_model.model.model.layers.0.self_attn.q_proj.lora_B.weight: copying a param with shape torch.Size([4096, 16]) from checkpoint, the shape in current model is torch.Size([4096, 8]).
	size mismatch for base_model.model.model.layers.0.self_attn.v_proj.lora_A.weight: copying a param with shape torch.Size([16, 4096]) from checkpoint, the shape in current model is torch.Size([8, 4096]).
	size mismatch for base_model.model.model.layers.0.self_attn.v_proj.lora_B.weight: copying a param with shape torch.Size([4096, 16]) from checkpoint, the shape in current model is torch.Size([4096, 8]).
	[... the same four q_proj/v_proj lora_A/lora_B size mismatches repeat for layers 1 through 31; the pasted log is truncated at layers.31.self_attn.v_proj.lora_A ...]
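For reference, the shapes in the trace suggest the adapter checkpoint was trained with LoRA rank r = 16 (the lora_A matrices are [16, 4096]), while the model currently builds its LoRA layers with r = 8. Below is a rough sketch of how the ranks would need to line up when loading; this assumes the standard peft API, and the path and hyperparameters are placeholders, not the tuner's actual settings:

```python
# Sketch only: LoraConfig.r must match the rank the adapter checkpoint
# was trained with (16 here, per the shapes in the traceback above).
from peft import LoraConfig, get_peft_model
from transformers import LlamaForCausalLM

base = LlamaForCausalLM.from_pretrained("path/to/llama-7b")  # placeholder path

config = LoraConfig(
    r=16,                                 # must match the checkpoint (error shows 16 vs. 8)
    lora_alpha=16,                        # assumed; check the adapter's adapter_config.json
    target_modules=["q_proj", "v_proj"],  # matches the modules named in the trace
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, config)
# Alternatively, PeftModel.from_pretrained(base, "path/to/adapter") reads
# adapter_config.json and rebuilds the LoRA layers with the saved rank.
```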

Lev-Stambler · Jun 12 '23 17:06