comfyanonymous
LoRAs still seem to work fine; they might just be weaker when the weights are in fp8, so you might have to bump up the strength a bit.
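To illustrate why a LoRA can look weaker on fp8 weights: a coarser number format rounds small weight updates partly or fully away, and raising the strength pushes the update over the rounding threshold. A toy numpy sketch — the `fake_quant` helper and its step size are made up for illustration, not the real fp8 format or ComfyUI's code:

```python
import numpy as np

def fake_quant(x, step=0.125):
    """Round to a coarse grid -- a crude stand-in for fp8's limited
    precision (step size is illustrative, not the real e4m3 spacing)."""
    return np.round(np.asarray(x, dtype=np.float64) / step) * step

w = 1.0            # a base model weight
delta = 0.05       # a small LoRA update at strength 1.0

# at strength 1.0 the update rounds away entirely
print(fake_quant(w + delta))        # -> 1.0, the LoRA had no visible effect

# bumping the strength lets the update survive quantization
print(fake_quant(w + 2 * delta))    # -> 1.125
```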
Can you try with 16-bit weights to see if it works?
Should be fixed if you update the standalone.
If anyone is getting this and isn't using the standalone, the fix is to go into your site-packages\torch\lib\ folder and copy libiomp5md.dll to libomp140.x86_64.dll. You can also run ComfyUI once...
How do they compare to stable diffusion?
Can you test this properly? The endpoint doesn't work after I run a regular workflow.
The reason I don't enable fp16 in Lumina2 is that the neta yume 3.5 model breaks with fp16 + my clamping. It also breaks when using the downscaling in this PR.
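For context on why fp16 is fragile here: fp16's largest finite value is 65504, so anything beyond that overflows to inf on a straight cast, while clamping first avoids the infs but silently alters the large values, which a model sensitive to them can break on. A rough numpy sketch of that tradeoff (illustrative only, not ComfyUI's actual clamping code):

```python
import numpy as np

FP16_MAX = 65504.0  # largest finite float16 value

acts = np.array([3.0, 70000.0, -90000.0], dtype=np.float32)

# straight cast: out-of-range entries overflow to +/-inf
overflowed = acts.astype(np.float16)

# clamping first keeps everything finite, but the large values
# are silently changed -- a model that relies on them can break
clamped = np.clip(acts, -FP16_MAX, FP16_MAX).astype(np.float16)

print(overflowed)  # the two large entries become inf / -inf
print(clamped)
```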
Post the full log; if you don't, your report is completely useless.
If you want it fixed, post your full logs and workflow.
Are you using the bf16 weights?