
size mismatch using the converted .bin file

Open XiaoyuShi97 opened this issue 1 year ago • 0 comments

Hi, thanks a lot for your great work. I am converting a LoRA file in safetensors format, downloaded from civitai, using your format_convert.py. I then load the converted .bin file with pipe.unet.load_attn_procs, but I get the following error:

RuntimeError: Error(s) in loading state_dict for LoRACrossAttnProcessor:

size mismatch for to_q_lora.down.weight: copying a param with shape torch.Size([128, 320]) from checkpoint, the shape in current model is torch.Size([4, 320]).

It seems to be related to the config of the UNet's attention processor, but I could not find the corresponding documentation. Could you please provide some suggestions?
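For reference, the shapes in the error suggest the checkpoint was trained with LoRA rank 128, while the model is initializing its attention processors with the default rank 4. A minimal sketch to confirm the checkpoint's rank by inspecting the down-projection weights (the key name below is illustrative, taken from the error message):

```python
import torch

def infer_lora_rank(state_dict):
    """Return the LoRA rank implied by the first down-projection weight found.

    A LoRA down projection has shape (rank, hidden_dim), so the first
    dimension of any '*.down.weight' tensor gives the rank.
    """
    for name, weight in state_dict.items():
        if name.endswith("down.weight"):
            return weight.shape[0]
    return None

# Dummy entry mirroring the shape reported in the error message.
dummy_state_dict = {"to_q_lora.down.weight": torch.zeros(128, 320)}
print(infer_lora_rank(dummy_state_dict))  # 128, not the default rank of 4
```

In practice you would call this on the state dict loaded from the converted .bin file and make sure the attention processors are created with a matching rank before loading.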

XiaoyuShi97 avatar Sep 05 '23 09:09 XiaoyuShi97