stable-diffusion-webui-docker
ROCm support in Docker
Closes issue #63
Added a new Docker Compose profile for AUTOMATIC1111 with AMD ROCm support.
docker compose --profile auto-rocm up
Tried it on my 7900 XTX; works okay.
xFormers has experimental AMD support in 0.0.25, but I could not get it to work. https://github.com/facebookresearch/xformers/commit/44b0d075db02affa7eb006c2980fa509d821f91a
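For readers who want to see roughly what the new profile involves, here is a minimal sketch of a ROCm service in docker-compose.yml. The service name, build context, and port are assumptions for illustration, not the exact contents of this PR; the key ROCm-specific parts are passing through /dev/kfd and /dev/dri and adding the video group.

```yaml
# Hypothetical sketch only -- paths and names are assumed, not copied from the PR
services:
  auto-rocm:
    profiles: ["auto-rocm"]
    build:
      context: ./services/AUTOMATIC1111   # assumed build context
      dockerfile: Dockerfile.rocm
    ports:
      - "7860:7860"
    devices:
      # Expose the AMD GPU device nodes to the container
      - /dev/kfd:/dev/kfd
      - /dev/dri:/dev/dri
    group_add:
      # Allow the container user to access the GPU devices
      - video
```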
This works for me on my 6750 XT, but I have to add HSA_OVERRIDE_GFX_VERSION=10.3.0 as an ENV variable in Dockerfile.rocm
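For reference, this is roughly what that looks like in Dockerfile.rocm; where exactly the line goes in the file is up to you:

```dockerfile
# RDNA2 cards like the 6750 XT are not officially supported by ROCm;
# pretend to be gfx1030 so rocBLAS/PyTorch pick a supported kernel set
ENV HSA_OVERRIDE_GFX_VERSION=10.3.0
```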
Hey, I tried testing your branch, but are RX 550-580 cards supported? I got this error:
webui-docker-auto-rocm-1 | Stable diffusion model failed to load
webui-docker-auto-rocm-1 |
webui-docker-auto-rocm-1 | rocBLAS error: Cannot read /opt/rocm/lib/rocblas/library/TensileLibrary.dat: Illegal seek for GPU arch : gfx803
webui-docker-auto-rocm-1 | List of available TensileLibrary Files :
Any idea how to get it supported?
I think maybe the PyTorch version is too new, or the ROCm version, but I can't find which version to install.
I tried changing from ROCm 6 to ROCm 5.7 and got this error: RuntimeError: "LayerNormKernelImpl" not implemented for 'Half' (sdlog.log). What did I do wrong? :/
Got it running, but it still uses my CPU instead of the GPU (workingbutcpu.log).
Do you think it would be possible to merge this feature, @AbdBarho? I would love to take advantage of the updates to your repo while still using my AMD GPU.
Confirmed working on Ubuntu 24.04 with a 7800 XT.
Confirmed working on Manjaro with a 7800 GRE. I had to add HIP_VISIBLE_DEVICES=0 to the docker-compose.yml.
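In case it helps anyone else, a sketch of that addition in docker-compose.yml (the service name auto-rocm is assumed; adjust the index if your GPU is not the first device):

```yaml
services:
  auto-rocm:
    environment:
      # Restrict ROCm to the first GPU; useful when an iGPU or second card confuses device selection
      - HIP_VISIBLE_DEVICES=0
```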