TensorRT-LLM
Cannot install TensorRT-LLM on Windows - No CUDA compiler found
System Info
- main branch of TensorRT-LLM
- Windows 11, bare-metal build from source
Who can help?
@byshiue
Information
- [X] The official example scripts
- [ ] My own modified scripts
Tasks
- [X] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
Reproduction
I followed the steps in the Windows README for a bare-metal installation on Windows.
Expected behavior
Build successful, wheel built
Actual behavior
-- The CXX compiler identification is MSVC 19.38.33135.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.38.33130/bin/Hostx86/x86/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- NVTX is disabled
-- Importing batch manager
-- Building PyTorch
-- Building Google tests
-- Building benchmarks
-- Looking for a CUDA compiler
-- Looking for a CUDA compiler - NOTFOUND
CMake Error at CMakeLists.txt:118 (message): No CUDA compiler found
-- Configuring incomplete, errors occurred!
Traceback (most recent call last):
File "C:\TensorRT-LLM-Win\scripts\build_wheel.py", line 310, in
Additional notes
I did try setting CUDACXX and PATH, but no combination seems to work. I checked with the rel branch as well; same issue.
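For context, what I mean by setting those variables looks roughly like the PowerShell sketch below. The paths are illustrative and assume a default CUDA 12.2 toolkit location; adjust them to whatever version is installed. CMake consults the CUDACXX environment variable (and PATH) when it probes for a CUDA compiler.

```powershell
# Sketch only: point CMake's CUDA-compiler probe at nvcc before building.
# Paths assume the default install location for CUDA 12.2; adjust as needed.
$env:CUDA_PATH = "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.2"
$env:CUDACXX   = "$env:CUDA_PATH\bin\nvcc.exe"   # CMake reads CUDACXX when enabling the CUDA language
$env:PATH      = "$env:CUDA_PATH\bin;$env:PATH"  # nvcc must also be discoverable on PATH
python .\scripts\build_wheel.py                  # plus whatever arguments you were already passing
```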
Run the setup script provided. The script you are looking for is under the windows folder. You can skip Python and MPI if they are already correctly installed. Make sure to run PowerShell as administrator. The script will install CUDA 12.2 and add it to PATH. After installation, kill the current terminal and open a new one. See below.
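As a quick sanity check after the script finishes (a hedged sketch, assuming the script installed CUDA 12.2 to the default location), open a fresh PowerShell window and confirm the toolkit is actually visible before rerunning the build:

```powershell
# Verify the CUDA toolkit installed by the setup script is visible in the new terminal.
nvcc --version                                            # should report a 12.2 release
Get-Command nvcc | Select-Object -ExpandProperty Source   # where nvcc resolves from PATH
$env:CUDA_PATH                                            # the installer normally points this at the v12.2 toolkit
```

If nvcc still does not resolve here, the environment changes have not propagated to the shell you are running build_wheel.py from, and CMake's CUDA-compiler check will keep failing.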