x-stable-diffusion
Currently we only support a100 and t4 gpus. We may support other gpus in the future
When I enter "stochasticx stable-diffusion deploy --type aitemplate", the following message appears: "WARNING | stochasticx.client.stable_diffusion | Currently we only support a100 and t4 gpus. We may support other gpus in the future." My GPU is an RTX 4090. Please add it to the list of supported devices.
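For reference, the GPU model that the NVIDIA driver reports can be checked directly; a minimal Python sketch using nvidia-smi, independent of the stochasticx CLI:

```python
import subprocess

# Query the GPU model name that the NVIDIA driver reports.
# --query-gpu=name and --format=csv,noheader are standard nvidia-smi options.
name = subprocess.run(
    ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(name)  # e.g. "NVIDIA GeForce RTX 4090"
```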
Hi @appleatiger, we have added support for more GPUs, such as the RTX 3080, 3090, 4080, 4090, etc. Please reinstall the latest stochasticx library, try again, and let us know if you run into any issues. Thank you.
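To confirm the reinstall actually picked up a newer release, the installed version can be checked; a minimal sketch, assuming the pip package name is stochasticx as used in the commands in this thread:

```python
from importlib.metadata import version, PackageNotFoundError

# Print the installed stochasticx version to verify the upgrade took effect.
try:
    print(version("stochasticx"))
except PackageNotFoundError:
    print("stochasticx is not installed")
```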
"pip uninstall stochasticx" then "pip install stochasticx","stochasticx login","stochasticx stable-diffusion deploy --type aitemplate" a few minute later ,it shows "2023-01-05 09:42:02,949 | ERROR | stochasticx.utils.docker | 500 Server Error for http+docker://localnpipe/v1.41/containers/331de07878ab9cca10e9b8104bdd90021fd6804cf2bcf6ddafb5ba729c1b9bc5/start: Internal Server Error ("failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: error during container init: error running hook #0: error running hook: exit status 1, stdout: , stderr: Auto-detected mode as 'legacy' nvidia-container-cli: mount error: file creation failed: /var/lib/docker/overlay2/90bf537a32793e05ef4f026edae56918ebf2b1052ee602c13bf8a99572adde6d/merged/usr/lib/x86_64-linux-gnu/libnvidia-ml.so.1: file exists: unknown")"
A Docker image "public.ecr.aws/t8g5g2q5/stable-diffusion:aitemplate" and a container "stochasticx_stable_diffusion" are created.
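The traceback comes from stochasticx.utils.docker when the container is started, which suggests the Docker SDK for Python is used under the hood. To check whether the Docker/NVIDIA runtime setup itself can start a GPU container, independently of stochasticx, one can try a minimal CUDA container through the same SDK; this is only a sketch, and the nvidia/cuda image tag is just an example:

```python
import docker

# Minimal smoke test: start a CUDA base container with GPU access
# through the Docker SDK for Python and print nvidia-smi's output.
client = docker.from_env()
output = client.containers.run(
    "nvidia/cuda:11.8.0-base-ubuntu22.04",  # example image; any CUDA base image works
    "nvidia-smi",
    device_requests=[docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])],
    remove=True,
)
print(output.decode())
```

If this fails with the same nvidia-container-cli mount error, the problem is in the Docker/NVIDIA runtime configuration rather than in stochasticx itself.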
Hi @appleatiger, have you installed the NVIDIA runtime on your machine and set Docker's default runtime to nvidia-runtime?
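One way to check both points on a standard Linux setup; a minimal sketch, assuming the daemon config lives at /etc/docker/daemon.json, where the NVIDIA Container Toolkit normally registers its runtime:

```python
import json
from pathlib import Path

# Inspect the Docker daemon configuration to see whether the NVIDIA
# runtime is registered and whether it is set as the default runtime.
daemon_config = Path("/etc/docker/daemon.json")
if daemon_config.exists():
    config = json.loads(daemon_config.read_text())
    print("runtimes:", list(config.get("runtimes", {})))
    print("default-runtime:", config.get("default-runtime", "<not set>"))
else:
    print("no /etc/docker/daemon.json found")
```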