Banje
This issue still persists. Please bring back this functionality; it was a great feature.
I'm suddenly unable to use LoRAs at all, as well. I was previously able to run with --highvram in both fp16 and fp8, with LoRAs and hires fix. I can't even do...
Same thing here. I reinstalled the standalone from the read-me and reinstalled PyTorch, but it still eats all my VRAM and crashes ComfyUI after a couple of generations, every time.
I have a 3090 with 16GB of system RAM. I run Flux Dev in fp16 with one LoRA at a time, using the normal VRAM mode. I still have an...
My 3090 can no longer do anything with Flux; I've experienced extreme bottlenecking since some recent update to ComfyUI. I can't even use fp8 anymore... I was previously using...