TinyLlama
Inconsistent size when I use the Hugging Face model
size mismatch for model.layers.0.self_attn.k_proj.weight: copying a param with shape torch.Size([256, 2048]) from checkpoint, the shape in current model is torch.Size([2048, 2048]).
I get this error when I use

model = AutoModelForCausalLM.from_pretrained(path)

to load https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T.
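For reference, a minimal sketch of how I load the model (here `path` is the Hub id; a local copy of the checkpoint behaves the same for me):

from transformers import AutoModelForCausalLM

path = "TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T"

# This call raises the size mismatch above: the checkpoint stores
# model.layers.*.self_attn.k_proj.weight as [256, 2048], while the model
# being built expects [2048, 2048].
model = AutoModelForCausalLM.from_pretrained(path)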
Can anyone help me?