vllm
[Doc]: ROCm installation instructions do not work
📚 The doc issue
Following the instructions at https://docs.vllm.ai/en/latest/getting_started/amd-installation.html#build-from-source-rocm, and using the exact Docker image mentioned there (pytorch_rocm6.1.2_ubuntu20.04_py3.9_pytorch_staging.sif, although with a custom Python venv and PyTorch install), I run into the following error when running `python setup.py develop`:
```
Building PyTorch for GPU arch: gfx90a
-- Could NOT find HIP: Found unsuitable version "0.0.0", but required is at least "1.0" (found /opt/rocm)
HIP VERSION: 0.0.0
CMake Warning at .venv/lib/python3.10/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:22 (message):
  static library kineto_LIBRARY-NOTFOUND not found.
Call Stack (most recent call first):
  .venv/lib/python3.10/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:120 (append_torchlib_if_found)
  CMakeLists.txt:67 (find_package)
CMake Error at CMakeLists.txt:108 (message):
  Can't find CUDA or HIP installation.
```
The Docker image should have a proper HIP setup, right? `hipconfig` reports: `HIP version: 6.1.40093-bd86f1708`.
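For reference, a quick way to compare what `hipconfig` reports against what CMake's HIP detection might be reading is the shell check below. This is a diagnostic sketch only, not a fix; the `/opt/rocm/.info/version` path and the `ROCM_PATH`/`HIP_PATH` environment variables are assumptions about a standard ROCm layout and may differ inside the image.

```shell
# Diagnostic sketch: print the HIP/ROCm version from several sources the
# build may consult. Paths and variable names assume a standard /opt/rocm
# install (assumption, may differ in this container).
hipconfig --version 2>/dev/null || echo "hipconfig not on PATH"
cat /opt/rocm/.info/version 2>/dev/null || echo "no /opt/rocm/.info/version"
echo "ROCM_PATH=${ROCM_PATH:-unset} HIP_PATH=${HIP_PATH:-unset}"
```

If the version file or the environment variables disagree with what `hipconfig` prints, that mismatch would be consistent with CMake detecting HIP version "0.0.0".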
Suggest a potential alternative/fix
No response