[enhancement]: Compatibility with ROCm 6
### Is there an existing issue for this?

- [X] I have searched the existing issues

### Contact Details

_No response_

### What should this feature add?

Required for Ubuntu 24.04 compatibility (minimum fully compatible ROCm version is 6.2)

### Alternatives

_No response_

### Additional Content

_No response_
I've tried to solve this but failed. If any maintainers want to help me with this, I'd happily push a PR to fix it.
What I've done is:
- upgrade the dependency at these lines: https://github.com/invoke-ai/InvokeAI/blob/054bb6fe0a3728e717ba1edf604d0cc5d637bdc4/docker/Dockerfile#L40-L41 to https://download.pytorch.org/whl/rocm6.1
- run the `docker build --build-arg="GPU_DRIVER=rocm" [...]` command
- test this with a Radeon 7900 XTX on Linux using the expected env vars

The log output still says it uses the CPU rather than the GPU.
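For reference, here is a small sketch of the check I'm using to tell whether torch actually sees the GPU inside the container (the function name is mine; ROCm builds of torch report HIP devices through the `torch.cuda` API, so this works for both backends):

```python
# Minimal diagnostic sketch: report which device a torch install would use.
# Degrades gracefully if torch is not installed at all.
def detect_device() -> str:
    try:
        import torch
    except ImportError:
        return "torch not installed"
    # ROCm builds of torch expose HIP GPUs via torch.cuda and report the
    # HIP version in torch.version.hip (None on CUDA builds).
    if torch.cuda.is_available():
        return f"gpu: {torch.cuda.get_device_name(0)} (HIP {torch.version.hip})"
    return "cpu"

if __name__ == "__main__":
    print(detect_device())
```

Running this inside the built image prints `cpu` for me, which matches the log output.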
From what I've seen in Ollama's Dockerfile (https://github.com/ollama/ollama/blob/6bd8a4b0a1ac15d5718f52bbe1cd56f827beb694/Dockerfile), which does work on my setup, the ROCm setup is considerably more involved than it is here: it uses a separate build stage to keep the final image leaner. I'll look into making a similar change if you think it has a chance of being merged. I don't know if changing the container image alone would be enough, and I don't have a way to test this on a bare-metal machine.
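Roughly, the separate-stage approach I have in mind would look something like this (image tags, paths, and package sets below are placeholders, not tested against the InvokeAI build):

```dockerfile
# Hypothetical multi-stage sketch, loosely modeled on Ollama's approach:
# install the heavy ROCm toolchain in one stage, copy only what the
# runtime needs into the final image to keep it lean.
FROM rocm/dev-ubuntu-24.04 AS rocm-build
RUN pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm6.1

FROM ubuntu:24.04 AS runtime
# Placeholder path: copy only the installed Python packages forward.
COPY --from=rocm-build /usr/lib/python3/dist-packages /usr/lib/python3/dist-packages
```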
There's a whole cascade of dependencies here. We need torch with ROCm 6.2 support, which only exists in nightly builds and is only available for v2.4.1 (as of now). But we can't yet upgrade to 2.4.1 because of torchvision and diffusers dependencies, which we're working on solving as well. On top of that, we're still figuring out how to support Python 3.12 (I'm sure you already ran into this on Ubuntu 24.04, unless you're using pyenv to pin an older version).
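To make the cascade concrete, the constraint set described above would look roughly like this as a pip requirements fragment (the nightly index URL and pins are assumptions based on the comment, not verified):

```
# Hypothetical pins illustrating the dependency cascade -- not verified.
--index-url https://download.pytorch.org/whl/nightly/rocm6.2
torch>=2.4.1        # ROCm 6.2 wheels currently exist only in nightlies
torchvision         # must be upgraded in lockstep with torch
diffusers           # currently blocks the torch upgrade
```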
TL;DR - this is definitely in the works, I'm actively trying to get it sorted out as we speak. It will happen!
We have recently updated our ROCm images. Closing this as complete.