whisper
Add support for AMD GPU (ROCm Platform)
The PyPI name of OpenAI Triton for the ROCm platform is pytorch-triton-rocm, so this PR modifies setup.py to install the correct Triton package on ROCm. It also updates README.md with instructions for installing on the ROCm platform. Tested on ROCm with AMD GPUs.
What OS is this for?
OS is Linux
Instead of using an environment variable, I suggest using extras_require,
as can be seen here. Another option is to automatically detect whether the ROCm platform is in use.
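For reference, the extras_require suggestion would look roughly like the sketch below. This is a hypothetical setup.py fragment, not the PR's actual code; the package list is abbreviated and the "rocm" extra name is an assumption.

```python
from setuptools import setup

setup(
    name="openai-whisper",
    # ... other metadata and arguments elided ...
    install_requires=["torch", "numpy", "tqdm"],  # abbreviated for illustration
    extras_require={
        # Hypothetical extra: ROCm users would opt in explicitly with
        #   pip install openai-whisper[rocm]
        "rocm": ["pytorch-triton-rocm"],
    },
)
```

The trade-off is that users must know to request the extra, whereas automatic detection keeps `pip install openai-whisper` working unchanged on both platforms.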
Based on the suggestion, I removed the environment variable and added a function to detect the ROCm platform automatically.
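The detection code itself isn't quoted in this thread; one possible heuristic is sketched below, assuming a ROCm installation lives under /opt/rocm or is advertised via the ROCM_HOME/ROCM_PATH environment variables. The function name and the heuristic are illustrative, not the PR's actual implementation.

```python
import os


def is_rocm_platform(env=None, rocm_home="/opt/rocm"):
    """Guess whether this machine uses the ROCm platform.

    Heuristic (an assumption, not the PR's code): ROCm is considered
    present if the ROCm install directory exists, or if ROCM_PATH is set.
    """
    env = os.environ if env is None else env
    return os.path.isdir(env.get("ROCM_HOME", rocm_home)) or "ROCM_PATH" in env


# Pick the matching Triton distribution at setup time.
triton_requirement = "pytorch-triton-rocm" if is_rocm_platform() else "triton"
```

A `setup.py` would then append `triton_requirement` to `install_requires`, so the same source installs the right Triton package on both CUDA and ROCm systems.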
Will this work with generic AMD GPUs, i.e. newer integrated GPUs?
@x86Gr This is the list of supported GPUs
https://rocm.docs.amd.com/en/latest/release/gpu_os_support.html
@x86Gr @glangford This link should work: https://rocm.docs.amd.com/en/latest/release/gpu_os_support.html#linux-supported-gpus
Is there any particular reason the PR only selected a subset of supported AMD GPUs?
gfx1030 and gfx1100 appear to be missing.
@vadimkantorov @Reviewer of this PR: Is there any update on the review for merging this PR? Is there anything I can do to help speed up the process? Thanks!