PizzaSlice-cmd
4070 Super here. When selecting e4m3fn in the Load Diffusion Model node, the model is manually cast to bfloat16. Also on torch 2.6.0 CU124, Python 3.10.
ComfyAnonymous confirmed it is normal to be cast to bf16 when using model weight dtype torch.float8_e4m3fn (#6913): > yes, if you want to enable fp8 matrix multiplication you can use...
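For context, a minimal PyTorch sketch of what the "manual cast" means in general (an illustration of the mechanism, not ComfyUI's actual code): the weights stay stored in float8_e4m3fn to save VRAM, but they are cast up to bfloat16 right before the multiply, so the compute itself still runs in bf16.

```python
import torch

# Weights stored in fp8 (half the memory of bf16), activations in bf16.
weight_fp8 = torch.randn(4096, 4096, device="cuda").to(torch.float8_e4m3fn)
x = torch.randn(1, 4096, device="cuda", dtype=torch.bfloat16)

# "Manual cast" path: upcast the fp8 weights to bf16 just before the matmul,
# so the multiplication itself happens in bfloat16.
out = x @ weight_fp8.to(torch.bfloat16)
print(out.dtype)  # torch.bfloat16
```

Enabling fp8 matrix multiplication (as mentioned in #6913) would instead keep the operands in fp8 for the matmul itself, which needs hardware support (Ada/Hopper, so a 4070 Super qualifies); the sketch above only shows the default cast-to-bf16 behavior.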