RandomGitUser321
> I think you're using a model from InstantX and that they aren't supported yet. I might be mistaken, but I think only the Flux Controlnets by XLabs-AI are supported...
> I updated it just now and it still uses 10 GB for a single lora rank 64: prompt_id = queue_prompt(prompt)["prompt_id"] output_images =...
It's likely doing some kind of casting up to float32 or float16 and then back down to fp8, even if you're using an fp8 version of the model. It might...
Yeah I think I was on to something about it upcasting: `supported_inference_dtypes = [torch.bfloat16, torch.float32]` https://github.com/comfyanonymous/ComfyUI/blob/1c08bf35b49879115dedd8ec6bc92d9e8d8fd871/comfy/supported_models.py#L631
> Since upscaling is an important feature, it is very important to remove these artifacts to preserve the original image content. These artifacts are difficult to remove at low denoising...