Comments of Sunyata
> ComfyAnonymous confirmed it is normal for weights to be cast to bf16 while using model weight dtype torch.float8_e4m3fn.
>
> [#6913](https://github.com/comfyanonymous/ComfyUI/issues/6913)
>
> > yes, if you want to enable fp8...