Deep Gandhi
## Current blockers:
- WSL does not support rocm-smi, which makes usage tracking difficult.
- The pytorch 2.7+rocm6.3 binary doesn't work on WSL: `torch.cuda.is_available()` returns `False` even though `torch.version.hip`...
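For context, here is a minimal sketch of the mismatch on WSL, assuming the PyTorch 2.7 + ROCm 6.3 wheel is installed (the comments describe the behavior we saw; they are not captured output):

```python
import torch

# On the ROCm wheel, torch.version.hip reports a HIP version string
# (it is None on CUDA/CPU builds), confirming this is the ROCm build.
print("HIP version:", torch.version.hip)

# On WSL this returned False despite the ROCm build being installed,
# so PyTorch could not see the AMD GPU.
print("GPU visible:", torch.cuda.is_available())
```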
Thanks for the link @charmandercha! I was able to resolve this with some advice from the issue [here](https://github.com/ROCm/ROCm/issues/4749). Unfortunately, we won't be able to track GPU usage for now, as rocm-smi...
To anyone tracking this: We now support AMD GPUs - install instructions [here](https://transformerlab.ai/docs/install/install-on-amd) :) Please provide some feedback and we'd be happy to keep improving on this! (cc @charmandercha)
Hi, we recommend using this on native Ubuntu (22.04 or 24.04 for now, since those are the versions we've tested). Pop OS has been having some issues with the newer ROCm bare-metal installations...
Everything except flash-attn should be shown! We got rid of flash-attn, so that "not installed" message can be disregarded and we'll remove it soon. We use pyrsmi, so in case you'd...
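For anyone curious how usage tracking works with pyrsmi, here is a minimal sketch of polling GPU utilization and memory. The function names follow pyrsmi's `rocml` module as I understand it; please verify against the version you have installed, since this is an illustration rather than the exact code we ship:

```python
from pyrsmi import rocml

# Initialize the ROCm SMI library before making any queries.
rocml.smi_initialize()

try:
    for dev in range(rocml.smi_get_device_count()):
        name = rocml.smi_get_device_name(dev)
        util = rocml.smi_get_device_utilization(dev)    # GPU busy percent
        used = rocml.smi_get_device_memory_used(dev)    # bytes
        total = rocml.smi_get_device_memory_total(dev)  # bytes
        print(f"{name}: {util}% busy, "
              f"{used / 2**30:.1f}/{total / 2**30:.1f} GiB used")
finally:
    # Always release the SMI handle when done.
    rocml.smi_shutdown()
```

This is also why rocm-smi being unsupported on WSL blocked usage tracking there: pyrsmi relies on the same underlying ROCm SMI library.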
> It ran on pop os, I just had to remove all the folders from the transformer-lab installation that I had and now even the AMD icon is being rendered....
AMD is working now, closing this. Please re-open if there are any issues!
The chown solution didn't work, as the user had already tried it earlier; keeping this open to follow up on a solution.
Hey @VineeTagarwaL-code, just following up to see if you're still interested in working on this issue?
Hi @VineeTagarwaL-code, I'm opening this issue up for others to work on since it's been 2 months with no activity. Please ping us back if you end up working...