How can I make this work for AMD GPUs on Windows 10?
I have found a guide for AMD GPUs, but it's for Linux. Any help is appreciated!
for now it's Linux-only for AMD
but there are still some workarounds:
- DirectML: https://learn.microsoft.com/en-us/windows/ai/directml/gpu-pytorch-windows
- WSL2 but it's still buggy
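For the DirectML route above, a minimal hedged sketch of what using Microsoft's `torch-directml` package looks like (assumes `pip install torch-directml` on Windows; the helper name is mine, not from the thread):

```python
# Sketch only: torch-directml exposes AMD (and other) GPUs to PyTorch on Windows
# via DirectML, without CUDA or ROCm.
def get_dml_device():
    """Return a DirectML torch device, or None if torch-directml isn't installed."""
    try:
        import torch_directml  # Microsoft's DirectML backend for PyTorch
    except ImportError:
        return None
    return torch_directml.device()  # defaults to the first DirectML adapter

device = get_dml_device()
if device is not None:
    import torch
    # Tensors move to the DirectML device through the usual device/.to() API.
    x = torch.ones(2, 2, device=device)
    print(x.device)
else:
    print("torch-directml not installed; nothing to test on this machine")
```

Whether the webui itself runs on top of this is a separate question; DirectML only covers the PyTorch tensor side.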
@phineas-pta does AMD + WSL even work? Or some kind of Docker?
rule of thumb: you need a high tier brain to do ML on AMD (unless we can change that paradigm)
@BarfingLemurs it isn't Docker, ~~it does work but not all the time for everyone~~ edit: it doesn't work (for now)
disclaimer: I don't have any AMD GPU, but I've seen people struggling with WSL + AMD
for AMD you'll have to wait several years for ROCm to be as mature as CUDA
@phineas-pta I want to read some info on this and try it, but I can't find anything about it on GitHub or Google. Any pointers would be a massive help!
@BarfingLemurs sorry, my bad, it doesn't work at all
+1
@dennis-gonzales dual-booting into Linux is the only option
good news: ROCm coming to Windows: https://www.tomshardware.com/news/amd-rocm-comes-to-windows-on-consumer-gpus
bad news: we don't know when
@BarfingLemurs seems like WSL2 + Docker can work, see #1631
Author of #1631 here.
Two devices need to be passed into Docker (`/dev/kfd` and `/dev/dri`) for ROCm. If AMD's drivers on Windows / WSL2 provide those devices, it might work; otherwise an alternate device reservation syntax might be needed at the end of the docker-compose.yml file. I don't have Windows to test with. As it stands, it works for ROCm on Linux hosts with the amdgpu driver; I'm running NixOS as the Docker host. Hopefully the PR gets merged...
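For anyone wanting to try, a minimal compose sketch of the device passthrough described above (the service and image names here are placeholders, not from #1631):

```yaml
# docker-compose.yml fragment: expose the ROCm device nodes to the container.
services:
  textgen:                                   # placeholder service name
    image: local/text-generation-webui:rocm  # placeholder image name
    devices:
      - /dev/kfd   # ROCm/KFD compute interface
      - /dev/dri   # GPU render nodes
```

With plain `docker run`, the equivalent flags are `--device=/dev/kfd --device=/dev/dri`. Whether these nodes exist at all under WSL2 with AMD's Windows drivers is exactly the open question.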
@deftdawg I would love to test this, but I no longer have the amd setup
> AMD's drivers on Windows / WSL2 provides those devices it might work
I'm uncertain about this; in particular, I haven't seen any step-by-step guide for it. I don't know whether installing AMD drivers on Windows puts everything in place the way Linux does, or whether it will work seamlessly. Again, I haven't found any info :P
@phineas-pta ROCm was released for Windows recently; any possibility of enabling that option for Windows in the webui CLI now?
I was hoping I could run this. Now I know I won't be able to (my AMD GPU doesn't support ROCm).
@Drael64 llama.cpp has ROCm support on Windows, and I haven't heard anything about ExLlama.
This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.
Devs, please make this ready for AMD on Windows, or give everyone an NVIDIA!