Lora-for-Diffusers
size mismatch using the converted .bin file
Hi, thanks a lot for your great work. I am converting a LoRA file in safetensors format, downloaded from civitai, using your format_convert.py. Then I load the converted .bin file with pipe.unet.load_attn_procs, but I get the following error:
```
RuntimeError: Error(s) in loading state_dict for LoRACrossAttnProcessor:
    size mismatch for to_q_lora.down.weight: copying a param with shape torch.Size([128, 320]) from checkpoint, the shape in current model is torch.Size([4, 320]).
```
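For reference, here is a minimal sketch of what I am running (the model id and file name are placeholders for my actual setup):

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# converted_lora.bin is the output of format_convert.py applied to the
# civitai .safetensors file
pipe.unet.load_attn_procs("converted_lora.bin")  # <- raises the size mismatch
```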
It seems to be related to the configuration of the unet's attn processors, but I could not find any documentation covering it. Could you please provide some suggestions?
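In case it helps, here is a small diagnostic I ran on the converted file (assuming it is a plain torch state dict; the file name is a placeholder). My reading, which may be wrong, is that a LoRA down-projection weight has shape (rank, in_features), so the checkpoint appears to use rank 128 while the processors on the model side are built with the default rank 4:

```python
import torch

state_dict = torch.load("converted_lora.bin", map_location="cpu")

# The first dimension of each down-projection weight is the LoRA rank.
ranks = {k: v.shape[0] for k, v in state_dict.items() if k.endswith("down.weight")}
print(sorted(set(ranks.values())))  # prints [128] for me; the model expects 4
```

So my guess is that the rank of the attn processors needs to match the checkpoint, but I do not know where that is supposed to be configured.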