private-gpt

Failed building wheel for llama-cpp-python

Open · Thinkcore88 opened this issue 2 years ago · 3 comments

`pip install llama-cpp-python` fails. gcc-11 and g++-11 are installed. I am running in a VM on Ubuntu.

Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [128 lines of output]

  --------------------------------------------------------------------------------
  -- Trying 'Ninja' generator
  --------------------------------
  ---------------------------
  ----------------------
  -----------------
  ------------
  -------
  --
  Not searching for unused variables given on the command line.
  -- The C compiler identification is GNU 11.3.0
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Check for working C compiler: /usr/bin/cc - skipped
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- The CXX compiler identification is GNU 11.3.0
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Check for working CXX compiler: /usr/bin/c++ - skipped
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Configuring done (0.3s)
  -- Generating done (0.0s)
  -- Build files have been written to: /tmp/pip-install-j8zwfqtv/llama-cpp-python_04eb6b87adaa484b896f199f0dced2de/_cmake_test_compile/build
  --
  -------
  ------------
  -----------------
  ----------------------
  ---------------------------
  --------------------------------
  -- Trying 'Ninja' generator - success
  --------------------------------------------------------------------------------
  
  Configuring Project
    Working directory:
      /tmp/pip-install-j8zwfqtv/llama-cpp-python_04eb6b87adaa484b896f199f0dced2de/_skbuild/linux-x86_64-3.10/cmake-build
    Command:
      /tmp/pip-build-env-9hnvetry/overlay/local/lib/python3.10/dist-packages/cmake/data/bin/cmake /tmp/pip-install-j8zwfqtv/llama-cpp-python_04eb6b87adaa484b896f199f0dced2de -G Ninja -DCMAKE_MAKE_PROGRAM:FILEPATH=/tmp/pip-build-env-9hnvetry/overlay/local/lib/python3.10/dist-packages/ninja/data/bin/ninja --no-warn-unused-cli -DCMAKE_INSTALL_PREFIX:PATH=/tmp/pip-install-j8zwfqtv/llama-cpp-python_04eb6b87adaa484b896f199f0dced2de/_skbuild/linux-x86_64-3.10/cmake-install -DPYTHON_VERSION_STRING:STRING=3.10.6 -DSKBUILD:INTERNAL=TRUE -DCMAKE_MODULE_PATH:PATH=/tmp/pip-build-env-9hnvetry/overlay/local/lib/python3.10/dist-packages/skbuild/resources/cmake -DPYTHON_EXECUTABLE:PATH=/usr/bin/python3 -DPYTHON_INCLUDE_DIR:PATH=/usr/include/python3.10 -DPYTHON_LIBRARY:PATH=/usr/lib/x86_64-linux-gnu/libpython3.10.so -DPython_EXECUTABLE:PATH=/usr/bin/python3 -DPython_ROOT_DIR:PATH=/usr -DPython_FIND_REGISTRY:STRING=NEVER -DPython_INCLUDE_DIR:PATH=/usr/include/python3.10 -DPython3_EXECUTABLE:PATH=/usr/bin/python3 -DPython3_ROOT_DIR:PATH=/usr -DPython3_FIND_REGISTRY:STRING=NEVER -DPython3_INCLUDE_DIR:PATH=/usr/include/python3.10 -DCMAKE_MAKE_PROGRAM:FILEPATH=/tmp/pip-build-env-9hnvetry/overlay/local/lib/python3.10/dist-packages/ninja/data/bin/ninja -DCMAKE_BUILD_TYPE:STRING=Release
  
  Not searching for unused variables given on the command line.
  -- The C compiler identification is GNU 11.3.0
  -- The CXX compiler identification is GNU 11.3.0
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Check for working C compiler: /usr/bin/cc - skipped
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Check for working CXX compiler: /usr/bin/c++ - skipped
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Configuring done (0.3s)
  -- Generating done (0.0s)
  -- Build files have been written to: /tmp/pip-install-j8zwfqtv/llama-cpp-python_04eb6b87adaa484b896f199f0dced2de/_skbuild/linux-x86_64-3.10/cmake-build
  [1/2] Generating /tmp/pip-install-j8zwfqtv/llama-cpp-python_04eb6b87adaa484b896f199f0dced2de/vendor/llama.cpp/libllama.so
  FAILED: /tmp/pip-install-j8zwfqtv/llama-cpp-python_04eb6b87adaa484b896f199f0dced2de/vendor/llama.cpp/libllama.so
  cd /tmp/pip-install-j8zwfqtv/llama-cpp-python_04eb6b87adaa484b896f199f0dced2de/vendor/llama.cpp && make libllama.so
  I llama.cpp build info:
  I UNAME_S:  Linux
  I UNAME_P:  x86_64
  I UNAME_M:  x86_64
  I CFLAGS:   -I.              -O3 -std=c11   -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -march=native -mtune=native -DGGML_USE_K_QUANTS
  I CXXFLAGS: -I. -I./examples -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native
  I LDFLAGS:
  I CC:       cc (Ubuntu 11.3.0-1ubuntu1~22.04.1) 11.3.0
  I CXX:      g++ (Ubuntu 11.3.0-1ubuntu1~22.04.1) 11.3.0
  
  g++ -I. -I./examples -O3 -std=c++11 -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native -c llama.cpp -o llama.o
  cc  -I.              -O3 -std=c11   -fPIC -DNDEBUG -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -march=native -mtune=native -DGGML_USE_K_QUANTS   -c ggml.c -o ggml.o
  In file included from /usr/lib/gcc/x86_64-linux-gnu/11/include/immintrin.h:99,
                   from ggml.c:202:
  ggml.c: In function ‘ggml_vec_dot_q4_0_q8_0’:
  /usr/lib/gcc/x86_64-linux-gnu/11/include/fmaintrin.h:63:1: error: inlining failed in call to ‘always_inline’ ‘_mm256_fmadd_ps’: target specific option mismatch
     63 | _mm256_fmadd_ps (__m256 __A, __m256 __B, __m256 __C)
        | ^~~~~~~~~~~~~~~
  ggml.c:2324:15: note: called from here
   2324 |         acc = _mm256_fmadd_ps( d, q, acc );
        |               ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
  [the same ‘_mm256_fmadd_ps’ inlining error at ggml.c:2324 is repeated three more times]
  make: *** [Makefile:245: ggml.o] Error 1
  ninja: build stopped: subcommand failed.
  Traceback (most recent call last):
    File "/tmp/pip-build-env-9hnvetry/overlay/local/lib/python3.10/dist-packages/skbuild/setuptools_wrap.py", line 674, in setup
      cmkr.make(make_args, install_target=cmake_install_target, env=env)
    File "/tmp/pip-build-env-9hnvetry/overlay/local/lib/python3.10/dist-packages/skbuild/cmaker.py", line 697, in make
      self.make_impl(clargs=clargs, config=config, source_dir=source_dir, install_target=install_target, env=env)
    File "/tmp/pip-build-env-9hnvetry/overlay/local/lib/python3.10/dist-packages/skbuild/cmaker.py", line 742, in make_impl
      raise SKBuildError(msg)
  
  An error occurred while building with CMake.
    Command:
      /tmp/pip-build-env-9hnvetry/overlay/local/lib/python3.10/dist-packages/cmake/data/bin/cmake --build . --target install --config Release --
    Install target:
      install
    Source directory:
      /tmp/pip-install-j8zwfqtv/llama-cpp-python_04eb6b87adaa484b896f199f0dced2de
    Working directory:
      /tmp/pip-install-j8zwfqtv/llama-cpp-python_04eb6b87adaa484b896f199f0dced2de/_skbuild/linux-x86_64-3.10/cmake-build
  Please check the install target is valid and see CMake's output for more information.
  
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
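The "target specific option mismatch" in the log means ggml.c calls the AVX2/FMA intrinsic `_mm256_fmadd_ps`, but `-march=native` resolved to a (virtual) CPU that does not advertise those instruction sets, which is common when a hypervisor masks CPU features. A sketch of how to check, and a possible workaround, under the assumption that your VM is hiding AVX2/FMA (`LLAMA_AVX2`/`LLAMA_FMA` are llama.cpp CMake options, and `CMAKE_ARGS`/`FORCE_CMAKE` are environment variables the llama-cpp-python build reads):

```shell
# Does the (virtual) CPU advertise the instruction sets ggml wants?
grep -qw fma  /proc/cpuinfo && echo "fma: present"  || echo "fma: missing"
grep -qw avx2 /proc/cpuinfo && echo "avx2: present" || echo "avx2: missing"

# If either is missing, rebuild with those code paths switched off
# (FORCE_CMAKE=1 forces a rebuild instead of reusing a cached wheel):
#   CMAKE_ARGS="-DLLAMA_AVX2=off -DLLAMA_FMA=off" FORCE_CMAKE=1 \
#       pip install --no-cache-dir llama-cpp-python
```

Alternatively, enabling AVX2/FMA passthrough in the hypervisor's CPU settings (or using host-passthrough CPU mode) lets the default build succeed.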

Thinkcore88 · Jun 10 '23 21:06

I have been getting the same issue on Ubuntu 22.04.2.

kiwi-in-a-bag · Jun 11 '23 06:06

Works for me under Windows 10: https://tc.ht/PowerShell/AI/privategpt.ps1, after it updated some Python packages.

Thinkcore88 · Jun 11 '23 10:06

There seems to be a bug in llama-cpp-python 0.1.60; I had the same issue on Ubuntu 22.04 as well.

Try llama-cpp-python==0.1.59; that's the latest version I was able to use without issues.
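When downgrading, it helps to remove any partial install and skip pip's cache so the failing 0.1.60 build is not reused; a minimal sequence (the final import is just a sanity check):

```shell
# Remove any partially built install, then pin the last known-good release
pip uninstall -y llama-cpp-python
pip install --no-cache-dir llama-cpp-python==0.1.59

# Confirm which version actually got installed
python3 -c 'import llama_cpp; print(llama_cpp.__version__)'
```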

MsJamie · Jun 13 '23 10:06