
Torch is not able to use GPU

FoxMaccloud opened this issue 1 year ago · 6 comments

I'm attempting to run webui.sh on an Arch Linux install (Linux Arch 6.0.9-arch1-1). When launching Python I'm getting this error:

################################################################
Launching launch.py...
################################################################
Python 3.10.8 (main, Nov  1 2022, 14:18:21) [GCC 12.2.0]
Commit hash: 828438b4a190759807f9054932cae3a8b880ddf1
Traceback (most recent call last):
  File "/home/foxmaccloud/stable-diffusion-webui/launch.py", line 250, in <module>
    prepare_enviroment()
  File "/home/foxmaccloud/stable-diffusion-webui/launch.py", line 174, in prepare_enviroment
    run_python("import torch; assert torch.cuda.is_available(), 'Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check'")
  File "/home/foxmaccloud/stable-diffusion-webui/launch.py", line 58, in run_python
    return run(f'"{python}" -c "{code}"', desc, errdesc)
  File "/home/foxmaccloud/stable-diffusion-webui/launch.py", line 34, in run
    raise RuntimeError(message)
RuntimeError: Error running command.
Command: "/home/foxmaccloud/stable-diffusion-webui/venv/bin/python3" -c "import torch; assert torch.cuda.is_available(), 'Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check'"
Error code: 1
stdout: <empty>
stderr: Traceback (most recent call last):
  File "<string>", line 1, in <module>
AssertionError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check

I know that CUDA is Nvidia technology, so I was thinking it could be an issue with me having an AMD GPU?

FoxMaccloud avatar Nov 23 '22 14:11 FoxMaccloud

I'm on Arch as well; for AMD you can use ROCm. You probably want opencl-amd-dev. Then I installed the stable Linux ROCm build of PyTorch from pip.

Then it depends on your exact GPU. I have an RX 5700 (gfx1010) and fake it as gfx1030 using export HSA_OVERRIDE_GFX_VERSION=10.3.0.

After that the webui just works.

DarkShadow44 avatar Nov 25 '22 00:11 DarkShadow44
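In case it helps, here's a quick sanity check (a minimal sketch, assuming the ROCm wheels of torch are installed in the webui's venv) to confirm PyTorch actually sees the card before launching the webui; setting HSA_OVERRIDE_GFX_VERSION from Python here is only for illustration, exporting it in the shell as described above works the same way:

# Sanity check: does the ROCm build of PyTorch see the AMD GPU?
# Assumes the ROCm wheels of torch are installed in the active environment.
import os

# RDNA1 cards like the RX 5700 (gfx1010) need to pretend to be gfx1030;
# this mirrors `export HSA_OVERRIDE_GFX_VERSION=10.3.0` before launch.
os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

import torch

print("torch:", torch.__version__)
print("HIP runtime:", torch.version.hip)          # None on a CUDA- or CPU-only build
print("GPU visible:", torch.cuda.is_available())  # ROCm devices show up through the cuda API
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))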

I was able to make a fix myself. I added this line in launch.py:

commandline_args = os.environ.get('COMMANDLINE_ARGS', "--skip-torch-cuda-test --precision full --no-half")

Also, when running webui.py, I noticed that it just clones a git repo to my home directory and uses that, so now I'm just running python3 launch.py...

What kinda sucks though is that it's using my CPU and not my GPU. I have an AMD Vega 64 GPU.

FoxMaccloud avatar Nov 25 '22 01:11 FoxMaccloud
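For reference, a sketch of what that change looks like in launch.py (line numbers and surrounding code vary between webui versions; this is only the default-arguments fallback, not an exact diff). Note that --skip-torch-cuda-test only bypasses the GPU check, it does not make torch use the GPU:

# launch.py -- fall back to CPU-friendly flags when COMMANDLINE_ARGS is unset.
# --skip-torch-cuda-test disables the GPU check; generation then runs on the CPU.
# --precision full and --no-half avoid half-precision ops that break on CPU.
import os

commandline_args = os.environ.get(
    'COMMANDLINE_ARGS',
    "--skip-torch-cuda-test --precision full --no-half",
)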

Accidentally closed 🤦

FoxMaccloud avatar Nov 25 '22 01:11 FoxMaccloud

What does rocminfo give?

DarkShadow44 avatar Nov 25 '22 19:11 DarkShadow44

Your best option is probably looking on YouTube for ways to run on AMD cards. I don't have any and haven't tested anything on them.

Delcos avatar Nov 25 '22 21:11 Delcos

If you use --skip-torch-cuda-test then it will only run on the CPU, which takes a lot longer to create images (2 or 3 minutes instead of ~10 seconds).

Raph404 avatar Dec 22 '22 20:12 Raph404
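As a rough illustration of that CPU fallback (a minimal sketch, assuming torch is importable in the webui's venv; the matrix size is arbitrary and not the webui's actual workload):

# Shows which device torch would use: with --skip-torch-cuda-test and no working
# ROCm/CUDA install this prints "cpu", which is why images take minutes instead of seconds.
import time
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("Running on:", device)

x = torch.randn(2048, 2048, device=device)
start = time.time()
y = x @ x  # a single large matmul as a stand-in for real work
if device.type == "cuda":
    torch.cuda.synchronize()  # wait for the GPU kernel to finish before timing
print(f"matmul on {y.device} took {time.time() - start:.3f}s")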