sherpa-onnx
[help wanted] Support AMD GPU
https://github.com/k2-fsa/sherpa-onnx/pull/153 added support for NVIDIA GPUs.
We also need to support AMD GPUs.
We need to integrate https://onnxruntime.ai/docs/execution-providers/MIGraphX-ExecutionProvider.html into sherpa-onnx.
Help from the community is highly appreciated.
Please leave a comment if you would like to help. We can help you if you run into any problems during the integration.
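For reference, enabling the MIGraphX execution provider on a session looks roughly like the following. This is a minimal sketch following the MIGraphX EP documentation linked above; it assumes an onnxruntime build with MIGraphX enabled, and "model.onnx" is only a placeholder for sherpa-onnx's actual model files.

```cpp
// Minimal sketch, based on the MIGraphX EP documentation linked above.
// Requires an onnxruntime build with MIGraphX enabled; "model.onnx" is a placeholder.
#include "onnxruntime_cxx_api.h"

int main() {
  Ort::Env env{ORT_LOGGING_LEVEL_WARNING, "sherpa-onnx"};
  Ort::SessionOptions session_options;

  int device_id = 0;  // which AMD GPU to run on
  // Append the MIGraphX execution provider; operators it cannot handle
  // fall back to the default CPU provider.
  OrtSessionOptionsAppendExecutionProvider_MIGraphX(session_options, device_id);

  Ort::Session session{env, "model.onnx", session_options};
  return 0;
}
```

Because unsupported operators fall back to the CPU provider, the existing sherpa-onnx model code should not need to change; only the session-options setup does.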
I would point out that an OpenCL implementation would be better, since it would be vendor-independent. But I've read that an OpenCL implementation in ONNX is very difficult. For AMD GPUs there is a compatibility layer called ZLUDA, but I haven't tested it.
We also need to support AMD GPUs.
Maybe we can take inspiration from pykeio/ort, which uses AMD ROCm.
Or from ggerganov/whisper.cpp, which uses CNugteren/CLBlast.
Another promising option is https://github.com/ROCm/HIP
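Whichever of these backends is chosen, onnxruntime can report at runtime which execution providers the linked build actually contains, which may help while testing the integration. A minimal sketch; the provider names in the comment are examples:

```cpp
// Minimal sketch: list the execution providers compiled into the linked
// onnxruntime library, so we know which GPU backends are actually available.
#include <iostream>
#include <string>

#include "onnxruntime_cxx_api.h"

int main() {
  for (const std::string &name : Ort::GetAvailableProviders()) {
    // Typical entries: "CPUExecutionProvider", plus e.g. "ROCMExecutionProvider"
    // or "MIGraphXExecutionProvider" if the build includes them.
    std::cout << name << "\n";
  }
  return 0;
}
```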
@csukuangfj Is there a specific reason you prefer MIGraphX? It depends on many third-party libraries, while ROCm looks simpler. Also, it seems it would be pretty simple to add support for it, similar to the CUDA PR. But for some reason the CUDA PR wasn't prepared for any other GPU backend, so some refactoring is needed there.
Is there a specific reason you prefer MIGraphX?
No. I listed it because when I searched Google for how to use AMD GPUs with onnxruntime, it gave me the link https://onnxruntime.ai/docs/execution-providers/MIGraphX-ExecutionProvider.html
I don't have any preference for which one to use, as long as we can achieve the goal: support AMD GPUs.
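For comparison, wiring up the ROCm execution provider through the onnxruntime C++ API would look roughly like this. A minimal sketch, assuming an onnxruntime build with the ROCm EP enabled; "model.onnx" is again a placeholder:

```cpp
// Minimal sketch, assuming an onnxruntime build with the ROCm EP enabled.
// "model.onnx" is a placeholder; sherpa-onnx would pass its own model files.
#include "onnxruntime_cxx_api.h"

int main() {
  Ort::Env env{ORT_LOGGING_LEVEL_WARNING, "sherpa-onnx"};
  Ort::SessionOptions session_options;

  OrtROCMProviderOptions rocm_options;  // defaults are fine for a first try
  rocm_options.device_id = 0;           // which AMD GPU to run on
  session_options.AppendExecutionProvider_ROCM(rocm_options);

  Ort::Session session{env, "model.onnx", session_options};
  return 0;
}
```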
If you could contribute, that would be great. If you run into any issues, we are happy to help at any time.