
Automate Support For Old AMD GPUs (DirectML)

hdawod opened this issue 2 years ago • 13 comments (status: Open)

Hi, I followed the default installation process, but when I run StableSwarmUI I receive the error message "Some backends have errored on the server. Check the server logs for details." I have an MSI Alpha 15 (Ryzen 7 with an AMD Radeon RX). Does StableSwarmUI support my AMD GPU? Any help, please?


hdawod avatar Aug 04 '23 20:08 hdawod

The platform is Windows 10.

hdawod avatar Aug 04 '23 20:08 hdawod

Check the console window; there should be an error message there.

mcmonkey4eva avatar Aug 04 '23 21:08 mcmonkey4eva

It should be a full path, e.g. C:\Comfy\ComfyUI\main.py; give that a shot.

midihex avatar Aug 05 '23 00:08 midihex

@mcmonkey4eva I checked the console window and it shows an error regarding the GPU. I have an 8 GB AMD GPU, but the error is asking for Nvidia. @midihex This is driving me crazy. I changed the path with no luck.

hdawod avatar Aug 05 '23 04:08 hdawod

On the ComfyUI GitHub it mentions some extra steps to make ComfyUI work with AMD:

pip install torch-directml, and add --directml in the backend Extra Args box.

But do check this information for yourself first.

I'm assuming auto AMD config isn't in the current scope of Swarm; it's more of a ComfyUI setup issue.

midihex avatar Aug 05 '23 04:08 midihex
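The upstream advice above, as a command sketch. This assumes a standalone ComfyUI checkout with its own Python environment; ComfyUI's --directml launch flag is what Swarm's Extra Args box passes through:

```shell
# Install the DirectML torch backend into ComfyUI's Python environment
pip install torch-directml

# Standalone ComfyUI can then be launched with DirectML enabled;
# in Swarm, the equivalent is adding --directml to the backend's Extra Args box
python main.py --directml
```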

Thanks @midihex. I installed directml and added the flag in the Extra Args box, but it still shows an odd error: ModuleNotFoundError: No module named 'torch_directml'. I tried to solve this error before but failed. Sorry if I'm giving you a headache! Here is the console window message.

hdawod avatar Aug 05 '23 05:08 hdawod

Might be worth asking for help over on the ComfyUI hub.

Also, I didn't realise that you'd installed ComfyUI via the Swarm install process, so your original link that started with dlbackend/ was fine. My example was for if you had ComfyUI already installed.

But still, the error looks to be ComfyUI related. When you installed torch-directml, were there no errors?

midihex avatar Aug 05 '23 08:08 midihex


Hello, I think I know what you are trying to do. I am on Mac, but I will give you a suggestion to try:

  1. Go to the folder dlbackend/ComfyUI and make sure you activate the venv
  2. source venv/bin/activate
  3. pip install -r requirements.txt, or whatever you want to install in your environment
  4. Launch your script
  5. Point StartScript at the path to main.py
  6. Set ExtraArgs; I use --normalvram

Let me know if that works for you.

LinuxAITottiLabs avatar Aug 05 '23 22:08 LinuxAITottiLabs
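The steps above can be sketched as a shell session (Linux/macOS; this assumes a venv actually exists in the ComfyUI folder, which turns out not to be the case for the default Windows Swarm install):

```shell
# Move into the ComfyUI backend folder (path assumed from the default Swarm layout)
cd dlbackend/ComfyUI

# Activate the virtual environment, if one exists here
source venv/bin/activate

# Install ComfyUI's dependencies, or whatever else your setup needs
pip install -r requirements.txt
```

In the Swarm backend settings, StartScript would then point at this folder's main.py, with --normalvram in ExtraArgs.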

Hi @tottiaa Could you explain how to do it in more detail? I tried to follow the steps, but there is no venv in the ComfyUI folder. I installed a virtual environment and the venv folder appeared, but when I ran the activate file by double-clicking, nothing happened. I can't do steps 3 to 5.

hdawod avatar Aug 07 '23 04:08 hdawod

There isn't a venv in the default Comfy install. You'll want to:

  • Open StableSwarmUI\dlbackend\comfy
  • open a command line there
  • run python_embeded\python.exe -m pip install torch-directml

EDIT: and as said above, add --directml in the backend Extra Args box

EDIT2: according to Comfy, directml requires torch 2.0 (as opposed to the current 2.1), so you'll also need to backdate torch.

mcmonkey4eva avatar Aug 07 '23 04:08 mcmonkey4eva
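Putting those steps together as a Windows command-line sketch (paths follow the default Swarm install layout; the exact torch==2.0.1 pin is an assumption based on the "Torch 2.0" requirement mentioned above):

```shell
# From the Swarm install root, move to the bundled ComfyUI backend
cd StableSwarmUI\dlbackend\comfy

# Install torch-directml into the embedded Python
python_embeded\python.exe -m pip install torch-directml

# torch-directml required torch 2.0.x at the time, so pin torch back if needed
python_embeded\python.exe -m pip install "torch==2.0.1"
```

After that, --directml still has to go in the backend's Extra Args box.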

Update: I now have the logic to automatically install with AMD compatibility... in theory.

In practice, on Windows, torch-directml is a deeply broken library that's basically unsupported, so the installer won't actually work until/unless they fix it. Notably, Comfy uses Python 3.11 or 3.12 and torch 2.1, while directml requires Python 3.10 and torch 2.0. Why such strict version locks, I don't even know. But they're in the way of compatibility for any installation method that isn't manual hackery.

For Linux users I expect it should just work via rocm (untested).

mcmonkey4eva avatar Jan 10 '24 02:01 mcmonkey4eva
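The version locks described above could be captured in a small pre-flight check. A minimal sketch, assuming the stated requirements (Python 3.10 and torch 2.0.x) are accurate for torch-directml at the time:

```shell
# Returns success (0) only if the given Python and torch versions match
# torch-directml's assumed requirements: Python 3.10 and torch 2.0.x.
directml_compatible() {
  py="$1"      # e.g. "3.10.6"
  torch="$2"   # e.g. "2.0.1"
  case "$py" in
    3.10|3.10.*) ;;
    *) return 1 ;;
  esac
  case "$torch" in
    2.0|2.0.*) return 0 ;;
    *) return 1 ;;
  esac
}
```

An installer could run a check like this before attempting the DirectML path, and fall back to a clear error message instead of a broken install otherwise.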

I can confirm the AMD install doesn't work; it installs the Nvidia torch instead of the ROCm version.

ghost avatar Jan 30 '24 07:01 ghost
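A hedged workaround sketch for the report above: manually replace the CUDA-built wheel with a ROCm build. The index URL follows PyTorch's published wheel-index pattern; the ROCm version shown is an example, so pick the one matching your driver:

```shell
# Remove the CUDA-built torch that the installer pulled in
pip uninstall -y torch torchvision torchaudio

# Reinstall from PyTorch's ROCm wheel index (rocm5.6 is an example version)
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/rocm5.6
```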

I tested the Windows installer and managed to get it to actually work, barely. On an RX 7900 XT (20 GiB VRAM) it installs and works, but you need to disable previews, it fills all 20 GiB of VRAM, and it takes forever to generate anything in SDXL. I hate it.

ZLUDA is probably a better option, but that has a pile of its own complications. Argh.

mcmonkey4eva avatar Jun 06 '24 15:06 mcmonkey4eva