
Question: do the PixArt nodes support the FP8 storage type?

SLAPaper opened this issue 1 year ago · 1 comment

The FP8 storage type introduced in https://github.com/comfyanonymous/ComfyUI/issues/2157 significantly reduces VRAM usage, so I wonder whether the PixArt models support it yet?

SLAPaper · Jan 24 '24 17:01

I remember trying it, but I couldn't get it to work with the diffusers-provided T5 model. I'd have to somehow run inference in 32-bit/16-bit while keeping the weights in the 8-bit format. That would involve re-implementing T5 from scratch, I think?

As for the model itself, it most likely doesn't need it, since it's only ~1.2 GB in FP16.

city96 · Jan 24 '24 22:01