rocm llama.cpp not recognizing amd instinct mi50 (gfx906)
@iEddie-cmd gfx906 is not supported by the ROCm runtime in LM Studio.
GFX1030, 1100, and 1101 are the supported AMD GPUs for ROCm in LM Studio; you will need to use the Vulkan llama.cpp runtime for GPU offload.
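The support check above boils down to a membership test against a short allow-list. A minimal sketch, assuming the supported-target list quoted in this thread (gfx1030, gfx1100, gfx1101) and a hypothetical helper name:

```python
# Assumed supported ROCm targets, taken from the comment above.
SUPPORTED_ROCM_TARGETS = {"gfx1030", "gfx1100", "gfx1101"}

def rocm_supported(gfx_target: str) -> bool:
    """Return True if the gfx target can use the ROCm runtime;
    otherwise fall back to the Vulkan llama.cpp runtime."""
    return gfx_target.lower() in SUPPORTED_ROCM_TARGETS
```

For example, `rocm_supported("gfx906")` returns `False` (the MI50 case), while `rocm_supported("gfx1030")` returns `True`.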
@YorkieDev I have a GFX1030 and it won't work.
*******
Agent 2
*******
Name: gfx1030
Uuid: GPU-5bbb5cb003adfcde
Marketing Name: AMD Radeon RX 6800 XT
Vendor Name: AMD
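The agent block above is `rocminfo`-style output; the relevant field when checking runtime support is the `Name:` line carrying the gfx target. A small sketch (hypothetical helper, regex-based) that pulls gfx targets out of such output:

```python
import re

def gfx_targets(rocminfo_text: str) -> list[str]:
    """Extract gfx architecture names from rocminfo-style agent output."""
    return re.findall(r"Name:\s*(gfx\w+)", rocminfo_text)

# Sample text modeled on the agent block in this thread.
sample = """\
*******
Agent 2
*******
Name: gfx1030
Uuid: GPU-5bbb5cb003adfcde
Marketing Name: AMD Radeon RX 6800 XT
Vendor Name: AMD
"""
```

Here `gfx_targets(sample)` yields `["gfx1030"]`; the `Marketing Name:` line is skipped because its value does not start with `gfx`.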
@gbschenkel The 6800 XT should work with ROCm. Can you open a new issue in https://github.com/lmstudio-ai/lmstudio-bug-tracker
and include screenshots of the following LM Studio pages: Hardware (CTRL + SHIFT + H) and Runtimes (CTRL + SHIFT + R)?
@YorkieDev ROCm was still working back on v1.32.1: https://github.com/lmstudio-ai/lmstudio-bug-tracker/issues/658