Mochi problem: "model weight dtype torch.bfloat16, manual cast: torch.float32" + "unet unexpected: ['scaled_fp8']" despite using all the right models.
Your question
Hi, as you can see in the following screenshot, I am using all the right models for fp8 (the scaled variants, etc.).
Yet my ComfyUI (a fresh install) keeps doing a manual cast to float32, and it complains about the scaled fp8 weights, which I did load, yet it acts as if they aren't there???
What could be the problem? :) Thanks
Other
Current VRAM usage (despite very slow iterations): [screenshot]
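One quick way to narrow this down is to check whether the checkpoint file actually contains the `scaled_fp8` marker tensor that the log message refers to. A minimal sketch, assuming the `safetensors` package is installed; the file path is a placeholder for your own model file:

```python
# Minimal sketch: list the checkpoint's tensor keys to confirm whether the
# "scaled_fp8" marker is actually present in the file.
from safetensors import safe_open

# Hypothetical path -- point this at the model file you loaded in ComfyUI.
path = "models/diffusion_models/mochi_fp8_scaled.safetensors"

with safe_open(path, framework="pt", device="cpu") as f:
    keys = list(f.keys())

print("scaled_fp8" in keys)  # True -> the file is the scaled-fp8 variant
print([k for k in keys if "scaled" in k or "fp8" in k])
```

If the key is there, the "unexpected" warning suggests the loader doesn't recognize it rather than the file being wrong, so making sure ComfyUI itself is fully up to date is worth trying before swapping models.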
Have you tried with these nodes? https://github.com/kijai/ComfyUI-MochiWrapper