
Why can't I use my Lora after the update?

wuyougg opened this issue Aug 11 '24 10:08

There is no information about the phenomenon in your description, so we cannot diagnose it.

Please provide details on which LoRA model you are using and upload the terminal log (the full log including version information).
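For example, assuming a manual install launched from a terminal via main.py, one way to capture the complete log (including the startup version banner) is:

```
# write both stdout and stderr to a file while still printing to the terminal
python main.py 2>&1 | tee comfyui.log
```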

ltdrdata commented Aug 11 '24 11:08

```
got prompt
Requested to load BaseModel
Loading 1 new model
loaded in lowvram mode 64.0
Requested to load AutoencoderKL
Loading 1 new model
loaded in lowvram mode 64.0
```

I am using a TCD SD1.5 LoRA.

wuyougg commented Aug 12 '24 00:08

The reason I requested the full log was to check your environment and examine any errors that occurred earlier. Your report contains no information about the error.

ltdrdata commented Aug 12 '24 00:08

I'm not sure, but this person might be seeing the same issue I've been having since I tried to update to a version beyond 0.0.4.

I posted some more details in this thread, because I initially thought it was a core sampler issue, but it appears to be something with the LoRA loaders: https://github.com/comfyanonymous/ComfyUI/issues/4271#issuecomment-2285055709

Essentially, I'm unable to do any batch processing: using LoRAs appears to cause memory to overrun, resulting in very poor performance, lowvram mode kicking in (which performs even worse), and eventually a total crash of Comfy and/or the GPU.

Happy to help troubleshoot and provide more details in this thread, since it's not related to Flux and is affecting all my SDXL workflows that use LoRAs.

Kinglord commented Aug 13 '24 02:08

Suddenly I'm unable to use LoRAs at all as well. I was previously able to run with --highvram in both fp16 and fp8, with LoRAs and hires fix.

I can't even do fp8 anymore; something has gotten very messed up again. I can't use ComfyUI at all right now... 3090 with 24 GB VRAM + 16 GB system RAM. Normal VRAM and low VRAM modes don't help, either.
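For context, the launch configurations I'm describing are roughly the following (a sketch assuming the standard main.py entry point; the fp8 flag name is from mid-2024 builds and may differ in other versions):

```
# full-VRAM mode with the default fp16 weights (previously worked fine)
python main.py --highvram

# full-VRAM mode with the UNet weights in fp8 (flag name may vary by version)
python main.py --highvram --fp8_e4m3fn-unet

# fallback memory modes that don't help either
python main.py --normalvram
python main.py --lowvram
```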

I can literally train a LoRA on this computer without issue, yet I can't even use ComfyUI to generate anything with it anymore. Something is quite broken with LoRAs or the LoRA loaders; that's my best guess.

BigBanje commented Aug 13 '24 14:08

Use the release standalone package from the README, or downgrade PyTorch to 2.3.1.
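For example, a minimal sketch of the downgrade for a pip install with a CUDA 12.1 build (match the index URL to your own CUDA version):

```
# pin torch 2.3.1 together with its matching torchvision/torchaudio releases
pip install torch==2.3.1 torchvision==0.18.1 torchaudio==2.3.1 \
    --index-url https://download.pytorch.org/whl/cu121
```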

comfyanonymous commented Aug 13 '24 16:08

Thanks, I solved the problem by installing the new standalone package.

wuyougg commented Aug 13 '24 23:08