diffusers
Stable Diffusion ONNX requires cpu `onnxruntime` even if CUDA version is installed
Describe the bug
The ONNX support doesn't work with CUDAExecutionProvider
I installed onnxruntime-gpu
Running
import onnxruntime as ort
ort.get_device()

returns

GPU
and
ort.get_available_providers()
returns
['CPUExecutionProvider', 'TensorrtExecutionProvider', 'CUDAExecutionProvider']
but diffusers complains that onnxruntime is not installed and asks me to install the CPU version (pip install onnxruntime).
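A plausible cause (a sketch of the failure mode, not the confirmed diffusers implementation): if the availability check looks up the distribution name "onnxruntime" in the installed package metadata, the onnxruntime-gpu build is missed even though it provides the very same importable onnxruntime module. A minimal sketch of a check that accepts either distribution name:

```python
import importlib.metadata


def dist_available(candidates=("onnxruntime", "onnxruntime-gpu")):
    # Return True if any of the candidate distribution names is installed.
    # Both distributions expose the same importable "onnxruntime" module,
    # so probing a single distribution name is not enough.
    for name in candidates:
        try:
            importlib.metadata.version(name)
            return True
        except importlib.metadata.PackageNotFoundError:
            continue
    return False
```

(dist_available is a hypothetical helper for illustration; importlib.metadata requires Python 3.8+.)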
Reproduction
Install
pip install onnxruntime-gpu
and run
from diffusers import StableDiffusionOnnxPipeline
pipe = StableDiffusionOnnxPipeline.from_pretrained(
"CompVis/stable-diffusion-v1-4",
revision="onnx",
provider="CUDAExecutionProvider",
    use_auth_token=True,
)
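As a user-side stopgap until this is fixed, the provider choice can be guarded against what onnxruntime actually reports. The helper below is a hypothetical sketch, not part of the diffusers API:

```python
def pick_provider(available, preferred="CUDAExecutionProvider"):
    # Use the preferred execution provider if onnxruntime reports it,
    # otherwise fall back to the CPU provider.
    return preferred if preferred in available else "CPUExecutionProvider"


# Usage sketch: pass ort.get_available_providers() as `available`, e.g.
#   provider = pick_provider(ort.get_available_providers())
# and hand the result to StableDiffusionOnnxPipeline.from_pretrained(...).
```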
Logs
No response
System Info
- diffusers version: 0.3.0
- Platform: Linux-5.10.133+-x86_64-with-Ubuntu-18.04-bionic
- Python version: 3.7.13
- PyTorch version (GPU?): 1.12.1+cu113 (True)
- Huggingface_hub version: 0.9.1
- Transformers version: 4.21.3
- Using GPU in script?: Yes
- Using distributed or parallel set-up in script?: No
Works for me. Could you please paste the exact error messages you get when you run the example? Thanks!
Here is the Colab Notebook replicating the error.
This is my PR trying to fix this issue: #440
Oh, I see, I didn't understand it initially; it works for me because I have both packages installed.
Thanks @Vijayabhaskar96 and @SkyTNT, what do you think @anton-l ?
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
@anton-l is this solved now?
Fixed in https://github.com/huggingface/diffusers/pull/440 :+1:
Thanks!