Thanks, I'll look into it when I get the time. A lot of the memory management stuff is done by ComfyUI, so those parts might not be entirely applicable.
I remember trying it but couldn't get it to work with the diffusers-provided T5 model. I'd have to somehow run the inference in 32-bit/16-bit while keeping the weights in...
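Roughly the idea, as an untested sketch (assumes a recent PyTorch build with `torch.float8_e4m3fn`; the `Fp8Linear` name is just for illustration): the stored weight stays in a low-precision dtype and only gets upcast to the activation dtype for the actual matmul.

```python
import torch

class Fp8Linear(torch.nn.Linear):
    def forward(self, x):
        # Weight can be stored as float8_e4m3fn to save VRAM, but the matmul
        # runs in the input's dtype (fp16/fp32) since float8 compute kernels
        # aren't generally available.
        weight = self.weight.to(x.dtype)
        bias = self.bias.to(x.dtype) if self.bias is not None else None
        return torch.nn.functional.linear(x, weight, bias)
```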
Thanks, looks good now. I thought about it more and I *might* be able to get FP8 support working for just the PixArt model. I'll take a shot at it...
Alright, I got started on proper FP8 support on [this branch](https://github.com/city96/ComfyUI_ExtraModels/tree/ops). I think it's doable, though I don't have much free time these days. In the meantime I just added...
> Value not in list: t5v11_name: 'pytorch_model-00001-of-00002.bin' not in []

Did you download T5 and place it in `models/t5` as per the readme?
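The dropdown only lists files it actually finds in that folder (hence the empty `[]`). Something along these lines is what it should look like, though the exact filenames depend on which T5 checkpoint you downloaded:

```
ComfyUI/models/t5/
├── config.json
├── pytorch_model-00001-of-00002.bin
├── pytorch_model-00002-of-00002.bin
└── pytorch_model.bin.index.json
```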
Are you selecting the model from the dropdown in the t5 loader?
Hmmm, gave it a quick test and manually installing the latest timm seems to work without code changes, which is odd because I've had some reports in the past about...
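(For anyone else hitting this, updating it manually is just something like:)

```
pip install -U timm
```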
I've pushed an update that should allow changing the path to T5 with just the `extra_model_paths.yaml` file. As for what you should put in that file, it depends...
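As a rough example, assuming the folder key the node looks for is `t5` (the top-level name and the base path are placeholders, adjust them to your setup):

```yaml
my_models:
    base_path: /path/to/your/models/
    t5: t5/
```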
From that snippet you pasted, the "configs" bit still has the full path, not sure if that's causing the whole thing to be ignored. I gave it a quick test...
Click where it says "unidentified" and select it from the list?