
Cannot install current version of llama-cpp-python 0.3.16 on Windows (backend independent)

Open devtobi opened this issue 3 months ago • 7 comments

Prerequisites

Please answer the following questions for yourself before submitting an issue.

Expected Behavior

With the environment variables mentioned in the docs set, installing llama-cpp-python should succeed.

Current Behavior

The installation failed. See the error log below.

Environment and Context

Environment variables:

  • CMAKE_GENERATOR=MinGW Makefiles
  • CMAKE_ARGS=-DGGML_OPENBLAS=on -DGGML_BLAS=ON -DGGML_BLAS_VENDOR=OpenBLAS -DGGML_OPENBLAS=on -DCMAKE_C_COMPILER=E:/Development/w64devkit/bin/gcc.exe -DCMAKE_CXX_COMPILER=E:/Development/w64devkit/bin/g++.exe

I also tried this with the Vulkan and ROCm backends and got exactly the same error.
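
For completeness, the variables are set in the same PowerShell session that runs the install, roughly like this (values copied verbatim from the list above):

    # set in the same PowerShell session that later runs the install (values exactly as listed above)
    $env:CMAKE_GENERATOR = "MinGW Makefiles"
    $env:CMAKE_ARGS = "-DGGML_OPENBLAS=on -DGGML_BLAS=ON -DGGML_BLAS_VENDOR=OpenBLAS -DGGML_OPENBLAS=on -DCMAKE_C_COMPILER=E:/Development/w64devkit/bin/gcc.exe -DCMAKE_CXX_COMPILER=E:/Development/w64devkit/bin/g++.exe"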

System and dependency information:

  • AMD Ryzen 7 7800X3D
  • AMD 7900 XTX
  • Operating System: Windows
  • Python version: 3.13.6
  • w64devkit version: 2.4.0
  • make version: GNU Make 4.4.1 Built for x86_64-w64-mingw32
  • g++ version: g++.exe (GCC) 15.2.0

Failure Information (for bugs)

The installation failed (see the error log below). The error stays the same no matter which backend (OpenBLAS, Vulkan, ROCm) is chosen.

Steps to Reproduce

  1. Set the environment variables as described above
  2. Install the dependency via pip or any other package manager (I use uv sync; see the sketch after this list)
  3. Observe the error message below
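
As a concrete sketch of step 2 (the plain pip form is shown only for illustration; my project uses uv sync):

    # with the variables above set in the same session, either of these triggers the source build
    pip install llama-cpp-python --no-cache-dir --verbose
    # or, in a uv-managed project that already declares llama-cpp-python as a dependency:
    uv sync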

Failure Logs

[ 38%] Linking CXX shared library ..\..\..\..\bin\ggml-cpu.dll
      cd /d C:\Users\Tobias\AppData\Local\Temp\tmppea8rsz0\build\vendor\llama.cpp\ggml\src &&
      C:\Users\Tobias\AppData\Local\uv\cache\builds-v0\.tmpqQBqx0\Lib\site-packages\cmake\data\bin\cmake.exe -E
      cmake_link_script CMakeFiles\ggml-cpu.dir\link.txt --verbose=1
      C:\Users\Tobias\AppData\Local\uv\cache\builds-v0\.tmpqQBqx0\Lib\site-packages\cmake\data\bin\cmake.exe -E rm -f
      CMakeFiles\ggml-cpu.dir/objects.a
      E:\Development\w64devkit\bin\ar.exe qc CMakeFiles\ggml-cpu.dir/objects.a @CMakeFiles\ggml-cpu.dir\objects1.rsp
      E:\Development\w64devkit\bin\g++.exe -O3 -DNDEBUG -shared -o ..\..\..\..\bin\ggml-cpu.dll
      -Wl,--out-implib,libggml-cpu.dll.a -Wl,--major-image-version,0,--minor-image-version,0 -Wl,--whole-archive
      CMakeFiles\ggml-cpu.dir/objects.a -Wl,--no-whole-archive @CMakeFiles\ggml-cpu.dir\linkLibs.rsp
      make[2]: Leaving directory 'C:/Users/Tobias/AppData/Local/Temp/tmppea8rsz0/build'
      make[1]: Leaving directory 'C:/Users/Tobias/AppData/Local/Temp/tmppea8rsz0/build'

CMake Warning (dev) at CMakeLists.txt:21 (install):
        Target ggml has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
      Call Stack (most recent call first):
        CMakeLists.txt:109 (llama_cpp_python_install_target)
      This warning is for project developers.  Use -Wno-dev to suppress it.

E:\Development\w64devkit\bin/ld.exe: CMakeFiles\ggml-cpu.dir/objects.a(ggml-cpu.c.obj):ggml-cpu.c:(.text+0x31df):
      undefined reference to `GOMP_barrier'
      E:\Development\w64devkit\bin/ld.exe: CMakeFiles\ggml-cpu.dir/objects.a(ggml-cpu.c.obj):ggml-cpu.c:(.text+0x44fd):
      undefined reference to `GOMP_barrier'
      E:\Development\w64devkit\bin/ld.exe: CMakeFiles\ggml-cpu.dir/objects.a(ggml-cpu.c.obj):ggml-cpu.c:(.text+0x4587):
      undefined reference to `GOMP_barrier'
      E:\Development\w64devkit\bin/ld.exe: CMakeFiles\ggml-cpu.dir/objects.a(ggml-cpu.c.obj):ggml-cpu.c:(.text+0x4f95):
      undefined reference to `GOMP_barrier'
      E:\Development\w64devkit\bin/ld.exe: CMakeFiles\ggml-cpu.dir/objects.a(ggml-cpu.c.obj):ggml-cpu.c:(.text+0x5aad):
      undefined reference to `GOMP_single_start'
      E:\Development\w64devkit\bin/ld.exe: CMakeFiles\ggml-cpu.dir/objects.a(ggml-cpu.c.obj):ggml-cpu.c:(.text+0x5ab6):
      undefined reference to `GOMP_barrier'
      E:\Development\w64devkit\bin/ld.exe: CMakeFiles\ggml-cpu.dir/objects.a(ggml-cpu.c.obj):ggml-cpu.c:(.text+0x5abb):
      undefined reference to `omp_get_thread_num'
      E:\Development\w64devkit\bin/ld.exe: CMakeFiles\ggml-cpu.dir/objects.a(ggml-cpu.c.obj):ggml-cpu.c:(.text+0x5ae1):
      undefined reference to `omp_get_num_threads'
      E:\Development\w64devkit\bin/ld.exe: CMakeFiles\ggml-cpu.dir/objects.a(ggml-cpu.c.obj):ggml-cpu.c:(.text+0x6b40):
      undefined reference to `GOMP_parallel'
      E:\Development\w64devkit\bin/ld.exe: CMakeFiles\ggml-cpu.dir/objects.a(ggml-cpu.c.obj):ggml-cpu.c:(.text+0x90c):
      undefined reference to `GOMP_barrier'
      collect2.exe: error: ld returned 1 exit status
      make[2]: *** [vendor\llama.cpp\ggml\src\CMakeFiles\ggml-cpu.dir\build.make:341: bin/ggml-cpu.dll] Error 1
      make[1]: *** [CMakeFiles\Makefile2:350: vendor/llama.cpp/ggml/src/CMakeFiles/ggml-cpu.dir/all] Error 2
      make: *** [Makefile:135: all] Error 2

      *** CMake build failed

hint: This usually indicates a problem with the package or the build environment.
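
For what it's worth, GOMP_* and omp_* are OpenMP runtime (libgomp) symbols, so the link step apparently cannot find libgomp. A rough diagnostic sketch, using the paths from my setup:

    # if gcc prints just "libgomp.a" with no path, the toolchain does not ship the OpenMP runtime
    E:\Development\w64devkit\bin\gcc.exe -print-file-name=libgomp.a
    # does any libgomp exist anywhere under the w64devkit installation?
    Get-ChildItem -Recurse E:\Development\w64devkit -Filter "libgomp*"

If libgomp really is missing, adding -DGGML_OPENMP=OFF to CMAKE_ARGS might be worth a try, assuming the vendored llama.cpp still exposes that option.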

devtobi · Aug 18 '25

This happens because CMake is not installed by default on Windows. To install it, download the Windows installer that matches your system specs from https://cmake.org/download/.
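
After installing, a quick sanity check in a freshly opened terminal would be something like:

    # run in a new PowerShell window so the updated PATH is picked up
    cmake --version
    where.exe cmake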

mayanksinghobvs · Aug 19 '25

@mayanksinghobvs Even after installing CMake (latest version 4.1.0) and adding it to the PATH, the error stays the same.

devtobi · Aug 19 '25

Are you checking this in the VS Code integrated terminal or in a separate PowerShell window? Have you restarted your system after installing?

mayanksinghobvs · Aug 19 '25

I am using a dedicated PowerShell window 😄 And yes, of course I restarted my system; cmake --version responds correctly.

devtobi · Aug 19 '25

Does gcc --version work? If yes, download the installer from https://visualstudio.microsoft.com/visual-cpp-build-tools/, install the C++ Build Tools from it, and retry.
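
Once the Build Tools are installed, you could also try letting CMake use MSVC instead of MinGW, roughly like this (the generator name assumes VS 2022; untested on my side):

    # switch the generator to MSVC and drop the MinGW-specific compiler overrides
    $env:CMAKE_GENERATOR = "Visual Studio 17 2022"
    $env:CMAKE_ARGS = "-DGGML_BLAS=ON -DGGML_BLAS_VENDOR=OpenBLAS"
    pip install llama-cpp-python --no-cache-dir --force-reinstall --verbose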

mayanksinghobvs · Aug 19 '25

Yes, gcc --version works; the binary comes from w64devkit (which is also mentioned in the llama-cpp-python docs, see https://llama-cpp-python.readthedocs.io/en/latest/#windows-notes).

However, even after installing the Visual C++ Build Tools you mentioned, the error stays unchanged.

devtobi · Aug 19 '25

I have similar errors:

    D:\w64devkit\bin\g++.exe -O3 -DNDEBUG -shared -o ......\bin\libllama.dll -Wl,--out-implib,libllama.dll.a -Wl,--major-image-version,0,--minor-image-version,0 -Wl,--whole-archive CMakeFiles\llama.dir/objects.a -Wl,--no-whole-archive @CMakeFiles\llama.dir\linkLibs.rsp
    mingw32-make.exe[2]: *** [vendor\llama.cpp\src\CMakeFiles\llama.dir\build.make:530: bin/libllama.dll] Error 1
    mingw32-make.exe[2]: Leaving directory 'E:/TEMP/tmpww2l6owm/build'
    mingw32-make.exe[1]: *** [CMakeFiles\Makefile2:357: vendor/llama.cpp/src/CMakeFiles/llama.dir/all] Error 2
    mingw32-make.exe[1]: Leaving directory 'E:/TEMP/tmpww2l6owm/build'
    mingw32-make.exe: *** [Makefile:135: all] Error 2

It seems impossible to get working modules, at least for Python 3.13. On Raspbian OS it works, and it even recognizes the Gemma model. The ready-to-use prebuilt wheels include an old llama.dll and other libraries, and they don't work with modern models.
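
A rough way to check which build an installed wheel actually ships (the lib subfolder name is an assumption on my part):

    # print the installed package version and the folder where its bundled DLLs should live
    python -c "import llama_cpp, os; print(llama_cpp.__version__); print(os.path.join(os.path.dirname(llama_cpp.__file__), 'lib'))"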

Maybe it would be better to give up on building for Windows altogether, just to save people's time.

longtolik · Sep 18 '25