City

Results: 125 comments of City

From your error:
```
File "C:\AI Stuff\ComfyUI\ComfyUI\comfy\model_base.py", line 97, in __init__
    logging.info("model weight dtype {}, manual cast: {}".format(self.get_dtype(), self.manual_cast_dtype))
```
On the latest ComfyUI that [line is completely different](https://github.com/comfyanonymous/ComfyUI/blob/master/comfy/model_base.py#L97), i.e....
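A quick way to compare a traceback against the repo on GitHub is to print the exact source line the traceback points at. This is just a sketch: the `show_line` helper is made up, and the demo file stands in for the machine-specific ComfyUI path.

```python
import linecache

# Hypothetical helper: return the source line a traceback points at, so you
# can compare it with the same line number on GitHub.
def show_line(path: str, lineno: int) -> str:
    return linecache.getline(path, lineno).rstrip("\n")

# Demo against a throwaway file, since the real path is machine-specific:
with open("demo.py", "w") as f:
    f.write("a = 1\nb = 2\n")
print(show_line("demo.py", 2))  # → b = 2
```

If the printed line differs from what the pinned GitHub link shows, the local checkout is on an older commit.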

Yeah, not sure what could cause that. What do you get when you run `git status` in the main ComfyUI folder (I assume `C:\AI Stuff\ComfyUI\ComfyUI\`)? Edit: also try `update_comfyui.bat` if...
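For reference, these standard git commands show what a checkout is actually on. Run them from inside the ComfyUI folder (or any custom node folder); the paths mentioned above are assumptions about the user's install.

```shell
# Run from inside the ComfyUI folder (or a custom node folder).
git status -sb          # current branch plus any local modifications
git log -1 --oneline    # the exact commit that is checked out
# git pull              # then update to the latest commit if you are behind
```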

@pallavnawani It looks like your ExtraModels repo isn't on the latest version. That line [no longer has the current_device option](https://github.com/city96/ComfyUI_ExtraModels/blob/193c2fd9d3db5f49570350285eb77e4d34878001/PixArt/nodes.py#L29-L32).

@doogyhatts Should be good on both versions now; do a `git pull` on the ExtraModels repo.

Are you running Python with the `-s` flag? Without it, Python sometimes picks up packages installed to the system env for some reason. For embedded:
```
.\python_embeded\python.exe -s -m pip...
```
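The effect of `-s` is verifiable from within Python itself: it disables the user site-packages directory, which is why it stops the embedded interpreter from picking up packages installed elsewhere. A minimal check:

```python
import subprocess
import sys

# -s disables the user site-packages directory, so an interpreter started
# with it won't see packages installed to the user/system environment.
out = subprocess.run(
    [sys.executable, "-s", "-c", "import sys; print(sys.flags.no_user_site)"],
    capture_output=True, text=True,
)
print(out.stdout.strip())  # prints "1": user site-packages is disabled
```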

Did you update the node/ComfyUI to the latest? This should be fixed. The only time I can think of that this would happen is if you see "loading in lowvram mode"...

@RYG81 That is a completely different issue, and is most likely due to selecting the incorrect model (in this case, having alpha selected instead of sigma in the checkpoint loader...

Pushed a change a while ago that might fix this; it may be worth testing on the latest update.

Hunyuan DiT is now officially supported in ComfyUI, including LoRAs; you should use that implementation instead. https://comfyanonymous.github.io/ComfyUI_examples/hunyuan_dit/

Update and try again? I suspect it will still fail on low-VRAM machines, but maybe it'll work.