ComfyUI

Using OpenVINO to accelerate on Intel CPUs

Open Micraow opened this issue 1 year ago • 2 comments

I am using a CPU for Stable Diffusion XL, and I find it very slow. I learned that converting the model to OpenVINO IR can accelerate inference, and the OpenVINO pipeline is supported by diffusers. Could you please add the ability to load OpenVINO IR .bin files as models, like .ckpt and .safetensors files?

Alternatively, adding support for ONNX models would also work.

Also, if there are any other ways to accelerate inference on an Intel CPU with low memory, please let me know. Thank you! 🙏

Micraow avatar Jan 06 '24 14:01 Micraow

Up

leo-smi avatar Feb 12 '24 19:02 leo-smi