text-generation-webui

How can I make this work for AMD GPUs on Windows 10

Open D0WN70AD opened this issue 3 years ago • 10 comments

I have found a guide for AMD GPUs, but it's for Linux. Any help is appreciated!

D0WN70AD avatar Apr 09 '23 16:04 D0WN70AD

for now it's Linux-only for AMD

but there are still some workarounds:

  • DirectML: https://learn.microsoft.com/en-us/windows/ai/directml/gpu-pytorch-windows
  • WSL2, but it's still buggy
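For the DirectML route, the basic idea is Microsoft's torch-directml package, which exposes the GPU as an extra PyTorch device. A minimal sketch (assumes torch and torch-directml are installed; falls back gracefully otherwise):

```python
# Sketch: running a PyTorch op on a DirectML device (AMD/Intel/NVIDIA GPUs on Windows).
# Assumes the torch and torch-directml packages are installed; torch-directml is
# Microsoft's DirectML backend for PyTorch.
try:
    import torch
    import torch_directml

    dml = torch_directml.device()          # the DirectML device handle
    x = torch.ones(2, 2, device=dml)       # allocate a tensor on the GPU
    y = (x + x).to("cpu")                  # compute on the GPU, copy back to CPU
    ok = bool((y == 2).all())              # every element should be 2.0
except ImportError:
    ok = None  # packages not installed; nothing to run
```

Note that DirectML only covers plain PyTorch ops; it does not make CUDA-only extensions (quantization kernels, etc.) work.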

phineas-pta avatar Apr 09 '23 17:04 phineas-pta

@phineas-pta does AMD+WSL even work? Or some kind of Docker?

rule of thumb: you need a high-tier brain to do ML on AMD (unless we can change that paradigm)

BarfingLemurs avatar Apr 10 '23 02:04 BarfingLemurs

@BarfingLemurs it isn't Docker, ~~it does work but not all the time for everyone~~ edit: it doesn't work (for now)

disclaimer: I don't have an AMD GPU, but I've seen people struggling with WSL + AMD

for AMD you'll have to wait several years for ROCm to be as mature as CUDA

phineas-pta avatar Apr 10 '23 10:04 phineas-pta

@phineas-pta I want to read some info on this and try it, but I can't find anything about it on GitHub or Google. Any pointers would be a massive help!

BarfingLemurs avatar Apr 10 '23 14:04 BarfingLemurs

@BarfingLemurs sorry, my bad, it doesn't work at all

phineas-pta avatar Apr 10 '23 14:04 phineas-pta

+1

dennis-gonzales avatar Apr 13 '23 09:04 dennis-gonzales

@dennis-gonzales dual-booting into Linux is the only option

BarfingLemurs avatar Apr 13 '23 17:04 BarfingLemurs

good news: ROCm coming to Windows: https://www.tomshardware.com/news/amd-rocm-comes-to-windows-on-consumer-gpus

bad news: we don't know when

phineas-pta avatar Apr 15 '23 21:04 phineas-pta

@BarfingLemurs seems like WSL2 + Docker can work, see #1631

phineas-pta avatar May 03 '23 23:05 phineas-pta

> @BarfingLemurs seems like WSL2 + Docker can work, see #1631

Author of #1631 here.

Two devices need to be passed into Docker (/dev/kfd and /dev/dri) for ROCm. If AMD's drivers on Windows / WSL2 provide those devices, it might work; otherwise an alternate device-reservation syntax might be needed at the end of the docker-compose.yml file. I don't have Windows to test. As it stands, it works for ROCm on Linux hosts with the amdgpu driver; I'm running NixOS as the Docker host. Hopefully the PR gets merged...
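For reference, the device passthrough described above looks roughly like this in a docker-compose.yml (a sketch; the actual file in #1631 may differ):

```yaml
services:
  text-generation-webui:
    # ... image/build settings omitted ...
    devices:
      - /dev/kfd   # ROCm compute interface (kernel fusion driver)
      - /dev/dri   # GPU render nodes exposed by the amdgpu driver
```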

deftdawg avatar May 05 '23 03:05 deftdawg

@deftdawg I would love to test this, but I no longer have the amd setup

> If AMD's drivers on Windows / WSL2 provide those devices it might work

I'm uncertain about this; in particular, I haven't seen any step-by-step guide for it. I don't know whether installing AMD drivers on Windows exposes everything in one place like Linux does, or whether it will work seamlessly. Again, I haven't found any info :P

BarfingLemurs avatar May 19 '23 10:05 BarfingLemurs

@phineas-pta ROCm was released for Windows recently; any possibility of enabling that option for Windows in the webui CLI now?

DarkRB avatar Sep 18 '23 07:09 DarkRB

I was hoping I could run this. Now I know I won't be able to (my AMD GPU doesn't support ROCm).

Drael64 avatar Sep 21 '23 15:09 Drael64

@Drael64 it has llama.cpp ROCm support on Windows; I haven't heard about exllama.
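For context, llama.cpp's ROCm support is enabled by a hipBLAS build flag; around the time of this thread it was roughly the following (flag names have changed across llama.cpp versions, so treat this as a sketch rather than exact instructions):

```shell
# Sketch: building llama.cpp with ROCm/hipBLAS support
# (the flag name varies by llama.cpp version)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DLLAMA_HIPBLAS=ON
cmake --build build --config Release
```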

BarfingLemurs avatar Sep 23 '23 00:09 BarfingLemurs

This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.

github-actions[bot] avatar Nov 04 '23 23:11 github-actions[bot]

FUCKING DEVS, MAKE THIS READY FOR AMD ON WINDOWS OR GIVE EVERYONE AN NVIDIA!

dkonhplay avatar Jul 06 '24 18:07 dkonhplay