
ERROR: Failed building wheel for llama-cpp-python

Open rishabhgupta93 opened this issue 1 year ago • 9 comments

Dear Experts,

I am trying to install llama-cpp-python on RHEL 7 and the install fails with the following error:

*** CMake build failed [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

CMake is already installed on the system.

Can someone please help?

rishabhgupta93 avatar Jan 24 '24 17:01 rishabhgupta93

Can you re-run with --verbose and paste the output here?

abetlen avatar Jan 25 '24 15:01 abetlen

pip install llama-cpp-python --verbose Using pip 23.2.1 from /apps/workarea/PycharmProjects/llmsql/venv/lib/python3.9/site-packages/pip (python 3.9) Collecting llama-cpp-python Using cached llama_cpp_python-0.2.32.tar.gz (10.1 MB) Running command pip subprocess to install build dependencies Collecting scikit-build-core[pyproject]>=0.5.1 Obtaining dependency information for scikit-build-core[pyproject]>=0.5.1 from https://files.pythonhosted.org/packages/0c/5b/73dc7944ef0fdbe97626b40525f1f9ca2547d7c5229b358d45357ff62209/scikit_build_core-0.8.0-py3-none-any.whl.metadata Using cached scikit_build_core-0.8.0-py3-none-any.whl.metadata (19 kB) Collecting exceptiongroup (from scikit-build-core[pyproject]>=0.5.1) Obtaining dependency information for exceptiongroup from https://files.pythonhosted.org/packages/b8/9a/5028fd52db10e600f1c4674441b968cf2ea4959085bfb5b99fb1250e5f68/exceptiongroup-1.2.0-py3-none-any.whl.metadata Using cached exceptiongroup-1.2.0-py3-none-any.whl.metadata (6.6 kB) Collecting packaging>=20.9 (from scikit-build-core[pyproject]>=0.5.1) Obtaining dependency information for packaging>=20.9 from https://files.pythonhosted.org/packages/ec/1a/610693ac4ee14fcdf2d9bf3c493370e4f2ef7ae2e19217d7a237ff42367d/packaging-23.2-py3-none-any.whl.metadata Using cached packaging-23.2-py3-none-any.whl.metadata (3.2 kB) Collecting tomli>=1.1 (from scikit-build-core[pyproject]>=0.5.1) Using cached tomli-2.0.1-py3-none-any.whl (12 kB) Collecting pathspec>=0.10.1 (from scikit-build-core[pyproject]>=0.5.1) Obtaining dependency information for pathspec>=0.10.1 from https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl.metadata Using cached pathspec-0.12.1-py3-none-any.whl.metadata (21 kB) Collecting pyproject-metadata>=0.5 (from scikit-build-core[pyproject]>=0.5.1) Using cached pyproject_metadata-0.7.1-py3-none-any.whl (7.4 kB) Using cached packaging-23.2-py3-none-any.whl (53 kB) Using cached 
pathspec-0.12.1-py3-none-any.whl (31 kB) Using cached exceptiongroup-1.2.0-py3-none-any.whl (16 kB) Using cached scikit_build_core-0.8.0-py3-none-any.whl (139 kB) Installing collected packages: tomli, pathspec, packaging, exceptiongroup, scikit-build-core, pyproject-metadata Successfully installed exceptiongroup-1.2.0 packaging-23.2 pathspec-0.12.1 pyproject-metadata-0.7.1 scikit-build-core-0.8.0 tomli-2.0.1

[notice] A new release of pip is available: 23.2.1 -> 23.3.2 [notice] To update, run: pip install --upgrade pip Installing build dependencies ... done Running command Getting requirements to build wheel Getting requirements to build wheel ... done Running command pip subprocess to install backend dependencies Collecting ninja>=1.5 Obtaining dependency information for ninja>=1.5 from https://files.pythonhosted.org/packages/6d/92/8d7aebd4430ab5ff65df2bfee6d5745f95c004284db2d8ca76dcbfd9de47/ninja-1.11.1.1-py2.py3-none-manylinux1_x86_64.manylinux_2_5_x86_64.whl.metadata Using cached ninja-1.11.1.1-py2.py3-none-manylinux1_x86_64.manylinux_2_5_x86_64.whl.metadata (5.3 kB) Collecting cmake>=3.21 Obtaining dependency information for cmake>=3.21 from https://files.pythonhosted.org/packages/09/30/df85689d18122becb9b6495cf6778f9ef629bdaa3ec86f49809ab5772e35/cmake-3.28.1-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.metadata Using cached cmake-3.28.1-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.metadata (6.3 kB) Using cached ninja-1.11.1.1-py2.py3-none-manylinux1_x86_64.manylinux_2_5_x86_64.whl (307 kB) Using cached cmake-3.28.1-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (26.3 MB) Installing collected packages: ninja, cmake Successfully installed cmake-3.28.1 ninja-1.11.1.1

[notice] A new release of pip is available: 23.2.1 -> 23.3.2 [notice] To update, run: pip install --upgrade pip Installing backend dependencies ... done Running command Preparing metadata (pyproject.toml) *** scikit-build-core 0.8.0 using CMake 3.28.1 (metadata_wheel) Preparing metadata (pyproject.toml) ... done Requirement already satisfied: typing-extensions>=4.5.0 in ./venv/lib/python3.9/site-packages (from llama-cpp-python) (4.9.0) Requirement already satisfied: numpy>=1.20.0 in ./venv/lib/python3.9/site-packages (from llama-cpp-python) (1.26.3) Collecting diskcache>=5.6.1 (from llama-cpp-python) Obtaining dependency information for diskcache>=5.6.1 from https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl.metadata Using cached diskcache-5.6.3-py3-none-any.whl.metadata (20 kB) Requirement already satisfied: jinja2>=2.11.3 in ./venv/lib/python3.9/site-packages (from llama-cpp-python) (3.1.3) Requirement already satisfied: MarkupSafe>=2.0 in ./venv/lib/python3.9/site-packages (from jinja2>=2.11.3->llama-cpp-python) (2.1.4) Using cached diskcache-5.6.3-py3-none-any.whl (45 kB) Building wheels for collected packages: llama-cpp-python Running command Building wheel for llama-cpp-python (pyproject.toml) *** scikit-build-core 0.8.0 using CMake 3.28.1 (wheel) *** Configuring CMake... 
loading initial cache file /tmp/tmpacuyrs2s/build/CMakeInit.txt -- The C compiler identification is GNU 4.8.5 -- The CXX compiler identification is GNU 4.8.5 -- Detecting C compiler ABI info -- Detecting C compiler ABI info - done -- Check for working C compiler: /usr/bin/cc - skipped -- Detecting C compile features -- Detecting C compile features - done -- Detecting CXX compiler ABI info -- Detecting CXX compiler ABI info - done -- Check for working CXX compiler: /usr/bin/c++ - skipped -- Detecting CXX compile features -- Detecting CXX compile features - done -- Found Git: /usr/bin/git (found version "1.8.3.1") -- Performing Test CMAKE_HAVE_LIBC_PTHREAD -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed -- Check if compiler accepts -pthread -- Check if compiler accepts -pthread - yes -- Found Threads: TRUE -- Warning: ccache not found - consider installing it or use LLAMA_CCACHE=OFF -- CMAKE_SYSTEM_PROCESSOR: x86_64 -- x86 detected CMake Warning (dev) at CMakeLists.txt:21 (install): Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION. This warning is for project developers. Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:30 (install): Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION. This warning is for project developers. Use -Wno-dev to suppress it.

-- Configuring done (0.4s) -- Generating done (0.0s) -- Build files have been written to: /tmp/tmpacuyrs2s/build *** Building project with Ninja... Change Dir: '/tmp/tmpacuyrs2s/build'

Run Build Command(s): /tmp/pip-build-env-iclrry2i/normal/lib/python3.9/site-packages/ninja/data/bin/ninja -v [1/22] cd /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp && /tmp/pip-build-env-iclrry2i/normal/lib/python3.9/site-packages/cmake/data/bin/cmake -DMSVC= -DCMAKE_C_COMPILER_VERSION=4.8.5 -DCMAKE_C_COMPILER_ID=GNU -DCMAKE_VS_PLATFORM_NAME= -DCMAKE_C_COMPILER=/usr/bin/cc -P /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/../scripts/gen-build-info-cpp.cmake -- Found Git: /usr/bin/git (found version "1.8.3.1") [2/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -O3 -DNDEBUG -std=gnu++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wmissing-declarations -Wmissing-noreturn -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/build_info.dir/build-info.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/build_info.dir/build-info.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/build_info.dir/build-info.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/build-info.cpp [3/22] /usr/bin/cc -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wdouble-promotion -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/ggml.c FAILED: vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o /usr/bin/cc -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wdouble-promotion -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/ggml.c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/ggml.c:89:23: fatal error: stdatomic.h: No such file or directory #include <stdatomic.h> ^ compilation terminated. [4/22] /usr/bin/cc -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wdouble-promotion -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-alloc.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-alloc.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-alloc.c.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/ggml-alloc.c [5/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wmissing-declarations -Wmissing-noreturn -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/console.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/console.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/console.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/console.cpp [6/22] /usr/bin/c++ -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../.. 
-I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common -O3 -DNDEBUG -std=gnu++11 -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/llava-cli.cpp FAILED: vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o /usr/bin/c++ -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../.. 
-I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common -O3 -DNDEBUG -std=gnu++11 -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/llava-cli.cpp
In file included from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/llava-cli.cpp:7:0:
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/./base64.hpp: In static member function 'static Output_iterator base64::encode(Input_iterator, Input_iterator, Output_iterator, base64::alphabet)':
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/./base64.hpp:84:42: error: 'alphabet' is not a class, namespace, or enumeration
 const char* alpha = alphabet == alphabet::url_filename_safe
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/./base64.hpp: In static member function 'static uint8_t base64::_base64_value(base64::alphabet&, char)':
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/./base64.hpp:355:25: error: 'alphabet' is not a class, namespace, or enumeration
 if (alphabet == alphabet::standard) {
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/./base64.hpp:361:32: error: 'alphabet' is not a class, namespace, or enumeration
 } else if (alphabet == alphabet::url_filename_safe) {
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/./base64.hpp:370:28: error: 'alphabet' is not a class, namespace, or enumeration
 alphabet = alphabet::standard;
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/./base64.hpp:374:28: error: 'alphabet' is not a class, namespace, or enumeration
 alphabet = alphabet::standard;
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/./base64.hpp:378:28: error: 'alphabet' is not a class, namespace, or enumeration
 alphabet = alphabet::url_filename_safe;
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/./base64.hpp:382:28: error: 'alphabet' is not a class, namespace, or enumeration
 alphabet = alphabet::url_filename_safe;
 ^
[7/22] /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../.. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wno-cast-qual -pthread -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/llava.cpp
FAILED: vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o
/usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../.. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wno-cast-qual -pthread -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/llava.cpp
In file included from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/llava.cpp:10:0:
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common/base64.hpp: In static member function 'static Output_iterator base64::encode(Input_iterator, Input_iterator, Output_iterator, base64::alphabet)':
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common/base64.hpp:84:42: error: 'alphabet' is not a class, namespace, or enumeration
 const char* alpha = alphabet == alphabet::url_filename_safe
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common/base64.hpp: In static member function 'static uint8_t base64::_base64_value(base64::alphabet&, char)':
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common/base64.hpp:355:25: error: 'alphabet' is not a class, namespace, or enumeration
 if (alphabet == alphabet::standard) {
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common/base64.hpp:361:32: error: 'alphabet' is not a class, namespace, or enumeration
 } else if (alphabet == alphabet::url_filename_safe) {
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common/base64.hpp:370:28: error: 'alphabet' is not a class, namespace, or enumeration
 alphabet = alphabet::standard;
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common/base64.hpp:374:28: error: 'alphabet' is not a class, namespace, or enumeration
 alphabet = alphabet::standard;
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common/base64.hpp:378:28: error: 'alphabet' is not a class, namespace, or enumeration
 alphabet = alphabet::url_filename_safe;
 ^
/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common/base64.hpp:382:28: error: 'alphabet' is not a class, namespace, or enumeration
 alphabet = alphabet::url_filename_safe;
 ^
[8/22] /usr/bin/cc -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wdouble-promotion -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-backend.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-backend.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-backend.c.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/ggml-backend.c
[9/22] /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/.
-I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../.. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wno-cast-qual -pthread -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp FAILED: vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../.. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/../../common -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu++11 -fPIC -Wno-cast-qual -pthread -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp: In function ?clip_ctx* clip_model_load(const char*, int)?: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:583:57: error: use of deleted function ?std::basic_ifstream::basic_ifstream(const std::basic_ifstream&)? auto fin = std::ifstream(fname, std::ios::binary); ^ In file included from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9:0: /usr/include/c++/4.8.2/fstream:427:11: note: ?std::basic_ifstream::basic_ifstream(const std::basic_ifstream&)? is implicitly deleted because the default definition would be ill-formed: class basic_ifstream : public basic_istream<_CharT, _Traits> ^ /usr/include/c++/4.8.2/fstream:427:11: error: use of deleted function ?std::basic_istream::basic_istream(const std::basic_istream&)? In file included from /usr/include/c++/4.8.2/fstream:38:0, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9: /usr/include/c++/4.8.2/istream:58:11: note: ?std::basic_istream::basic_istream(const std::basic_istream&)? is implicitly deleted because the default definition would be ill-formed: class basic_istream : virtual public basic_ios<_CharT, _Traits> ^ /usr/include/c++/4.8.2/istream:58:11: error: use of deleted function ?std::basic_ios::basic_ios(const std::basic_ios&)? 
In file included from /usr/include/c++/4.8.2/ios:44:0, from /usr/include/c++/4.8.2/istream:38, from /usr/include/c++/4.8.2/fstream:38, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9: /usr/include/c++/4.8.2/bits/basic_ios.h:66:11: note: ?std::basic_ios::basic_ios(const std::basic_ios&)? is implicitly deleted because the default definition would be ill-formed: class basic_ios : public ios_base ^ In file included from /usr/include/c++/4.8.2/ios:42:0, from /usr/include/c++/4.8.2/istream:38, from /usr/include/c++/4.8.2/fstream:38, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9: /usr/include/c++/4.8.2/bits/ios_base.h:786:5: error: ?std::ios_base::ios_base(const std::ios_base&)? is private ios_base(const ios_base&); ^ In file included from /usr/include/c++/4.8.2/ios:44:0, from /usr/include/c++/4.8.2/istream:38, from /usr/include/c++/4.8.2/fstream:38, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9: /usr/include/c++/4.8.2/bits/basic_ios.h:66:11: error: within this context class basic_ios : public ios_base ^ In file included from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9:0: /usr/include/c++/4.8.2/fstream:427:11: error: use of deleted function ?std::basic_ios::basic_ios(const std::basic_ios&)? class basic_ifstream : public basic_istream<_CharT, _Traits> ^ /usr/include/c++/4.8.2/fstream:427:11: error: use of deleted function ?std::basic_filebuf::basic_filebuf(const std::basic_filebuf&)? /usr/include/c++/4.8.2/fstream:72:11: note: ?std::basic_filebuf::basic_filebuf(const std::basic_filebuf&)? 
is implicitly deleted because the default definition would be ill-formed: class basic_filebuf : public basic_streambuf<_CharT, _Traits> ^ In file included from /usr/include/c++/4.8.2/ios:43:0, from /usr/include/c++/4.8.2/istream:38, from /usr/include/c++/4.8.2/fstream:38, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9: /usr/include/c++/4.8.2/streambuf:802:7: error: ?std::basic_streambuf<_CharT, _Traits>::basic_streambuf(const std::basic_streambuf<_CharT, _Traits>&) [with _CharT = char; _Traits = std::char_traits]? is private basic_streambuf(const basic_streambuf& __sb) ^ In file included from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9:0: /usr/include/c++/4.8.2/fstream:72:11: error: within this context class basic_filebuf : public basic_streambuf<_CharT, _Traits> ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp: In function ?bool clip_model_quantize(const char*, const char*, int)?: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:932:58: error: use of deleted function ?std::basic_ofstream::basic_ofstream(const std::basic_ofstream&)? auto fout = std::ofstream(fname_out, std::ios::binary); ^ In file included from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9:0: /usr/include/c++/4.8.2/fstream:599:11: note: ?std::basic_ofstream::basic_ofstream(const std::basic_ofstream&)? is implicitly deleted because the default definition would be ill-formed: class basic_ofstream : public basic_ostream<_CharT,_Traits> ^ /usr/include/c++/4.8.2/fstream:599:11: error: use of deleted function ?std::basic_ostream::basic_ostream(const std::basic_ostream&)? 
In file included from /usr/include/c++/4.8.2/istream:39:0, from /usr/include/c++/4.8.2/fstream:38, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9: /usr/include/c++/4.8.2/ostream:58:11: note: ?std::basic_ostream::basic_ostream(const std::basic_ostream&)? is implicitly deleted because the default definition would be ill-formed: class basic_ostream : virtual public basic_ios<_CharT, _Traits> ^ /usr/include/c++/4.8.2/ostream:58:11: error: use of deleted function ?std::basic_ios::basic_ios(const std::basic_ios&)? In file included from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/examples/llava/clip.cpp:9:0: /usr/include/c++/4.8.2/fstream:599:11: error: use of deleted function ?std::basic_ios::basic_ios(const std::basic_ios&)? class basic_ofstream : public basic_ostream<_CharT,_Traits> ^ /usr/include/c++/4.8.2/fstream:599:11: error: use of deleted function ?std::basic_filebuf::basic_filebuf(const std::basic_filebuf&)? [10/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wmissing-declarations -Wmissing-noreturn -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/grammar-parser.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/grammar-parser.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/grammar-parser.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/grammar-parser.cpp [11/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/. 
-I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wmissing-declarations -Wmissing-noreturn -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp FAILED: vendor/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wmissing-declarations -Wmissing-noreturn -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp: In function ?bool gpt_params_parse_ex(int, char**, gpt_params&)?: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:223:50: warning: cast from type ?const char*? to type ?char*? 
casts away qualifiers [-Wcast-qual] file.read((char )params.prompt.data(), size); ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp: In function ?void dump_string_yaml_multiline(FILE, const char*, const char*)?: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1461:72: error: no matching function for call to ?regex_replace(std::string&, std::regex, const char [3])? data_str = std::regex_replace(data_str, std::regex("\n"), "\n"); ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1461:72: note: candidates are: In file included from /usr/include/c++/4.8.2/regex:62:0, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:12: /usr/include/c++/4.8.2/bits/regex.h:2162:5: note: template<class _Out_iter, class _Bi_iter, class _Rx_traits, class _Ch_type> _Out_iter std::regex_replace(_Out_iter, _Bi_iter, _Bi_iter, const std::basic_regex<_Ch_type, _Rx_traits>&, const std::basic_string<_Ch_type>&, std::regex_constants::match_flag_type) regex_replace(_Out_iter __out, _Bi_iter __first, _Bi_iter __last, ^ /usr/include/c++/4.8.2/bits/regex.h:2162:5: note: template argument deduction/substitution failed: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1461:72: note: deduced conflicting types for parameter ?_Bi_iter? (?std::basic_regex? and ?const char*?) 
data_str = std::regex_replace(data_str, std::regex("\n"), "\\n"); ^ In file included from /usr/include/c++/4.8.2/regex:62:0, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:12: /usr/include/c++/4.8.2/bits/regex.h:2182:5: note: template<class _Rx_traits, class _Ch_type> std::basic_string<_Ch_type> std::regex_replace(const std::basic_string<_Ch_type>&, const std::basic_regex<_Ch_type, _Rx_traits>&, const std::basic_string<_Ch_type>&, std::regex_constants::match_flag_type) regex_replace(const basic_string<_Ch_type>& __s, ^ /usr/include/c++/4.8.2/bits/regex.h:2182:5: note: template argument deduction/substitution failed: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1461:72: note: mismatched types ‘const std::basic_string<_Ch_type>’ and ‘const char [3]’ data_str = std::regex_replace(data_str, std::regex("\n"), "\\n"); ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1462:73: error: no matching function for call to ‘regex_replace(std::string&, std::regex, const char [3])’
data_str = std::regex_replace(data_str, std::regex("\""), "\\\""); ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1462:73: note: candidates are: In file included from /usr/include/c++/4.8.2/regex:62:0, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:12: /usr/include/c++/4.8.2/bits/regex.h:2162:5: note: template<class _Out_iter, class _Bi_iter, class _Rx_traits, class _Ch_type> _Out_iter std::regex_replace(_Out_iter, _Bi_iter, _Bi_iter, const std::basic_regex<_Ch_type, _Rx_traits>&, const std::basic_string<_Ch_type>&, std::regex_constants::match_flag_type) regex_replace(_Out_iter __out, _Bi_iter __first, _Bi_iter __last, ^ /usr/include/c++/4.8.2/bits/regex.h:2162:5: note: template argument deduction/substitution failed: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1462:73: note: deduced conflicting types for parameter ‘_Bi_iter’ (‘std::basic_regex’ and ‘const char*’) data_str = std::regex_replace(data_str, std::regex("\""), "\\\""); ^ In file included from /usr/include/c++/4.8.2/regex:62:0, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:12: /usr/include/c++/4.8.2/bits/regex.h:2182:5: note: template<class _Rx_traits, class _Ch_type> std::basic_string<_Ch_type> std::regex_replace(const std::basic_string<_Ch_type>&, const std::basic_regex<_Ch_type, _Rx_traits>&, const std::basic_string<_Ch_type>&, std::regex_constants::match_flag_type) regex_replace(const basic_string<_Ch_type>& __s, ^ /usr/include/c++/4.8.2/bits/regex.h:2182:5: note: template argument deduction/substitution failed: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1462:73: note: mismatched types ‘const std::basic_string<_Ch_type>’ and ‘const char [3]’
data_str = std::regex_replace(data_str, std::regex("\""), "\\\""); ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1463:83: error: no matching function for call to ‘regex_replace(std::string&, std::regex, const char [4])’ data_str = std::regex_replace(data_str, std::regex(R"(\\[^n"])"), R"(\$&)"); ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1463:83: note: candidates are: In file included from /usr/include/c++/4.8.2/regex:62:0, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:12: /usr/include/c++/4.8.2/bits/regex.h:2162:5: note: template<class _Out_iter, class _Bi_iter, class _Rx_traits, class _Ch_type> _Out_iter std::regex_replace(_Out_iter, _Bi_iter, _Bi_iter, const std::basic_regex<_Ch_type, _Rx_traits>&, const std::basic_string<_Ch_type>&, std::regex_constants::match_flag_type) regex_replace(_Out_iter __out, _Bi_iter __first, _Bi_iter __last, ^ /usr/include/c++/4.8.2/bits/regex.h:2162:5: note: template argument deduction/substitution failed: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1463:83: note: deduced conflicting types for parameter ‘_Bi_iter’ (‘std::basic_regex’ and ‘const char*’)
data_str = std::regex_replace(data_str, std::regex(R"(\\[^n"])"), R"(\$&)"); ^ In file included from /usr/include/c++/4.8.2/regex:62:0, from /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:12: /usr/include/c++/4.8.2/bits/regex.h:2182:5: note: template<class _Rx_traits, class _Ch_type> std::basic_string<_Ch_type> std::regex_replace(const std::basic_string<_Ch_type>&, const std::basic_regex<_Ch_type, _Rx_traits>&, const std::basic_string<_Ch_type>&, std::regex_constants::match_flag_type) regex_replace(const basic_string<_Ch_type>& __s, ^ /usr/include/c++/4.8.2/bits/regex.h:2182:5: note: template argument deduction/substitution failed: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/common.cpp:1463:83: note: mismatched types ‘const std::basic_string<_Ch_type>’ and ‘const char [4]’ data_str = std::regex_replace(data_str, std::regex(R"(\\[^n"])"), R"(\$&)"); ^ [12/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wmissing-declarations -Wmissing-noreturn -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/sampling.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/sampling.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/sampling.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/sampling.cpp [13/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/. -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/.
-O3 -DNDEBUG -std=gnu++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wmissing-declarations -Wmissing-noreturn -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/train.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/train.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/train.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/common/train.cpp [14/22] /usr/bin/cc -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wdouble-promotion -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/ggml-quants.c [15/22] /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -Dllama_EXPORTS -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wmissing-declarations -Wmissing-noreturn -Wno-array-bounds -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o -MF vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o.d -o vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp FAILED: vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -Dllama_EXPORTS -I/tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wmissing-declarations -Wmissing-noreturn -Wno-array-bounds -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o -MF vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o.d -o vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o -c /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:1227:10: warning: unused parameter ‘addr’ [-Wunused-parameter] bool raw_lock(const void * addr, size_t len) const { ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:1227:10: warning: unused parameter ‘len’ [-Wunused-parameter] /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:1232:17: warning: unused parameter ‘addr’ [-Wunused-parameter] static void raw_unlock(const void * addr, size_t len) {} ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:1232:17: warning: unused parameter ‘len’
[-Wunused-parameter] /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:6882:15: error: ‘is_trivially_copyable’ is not a member of ‘std’ static_assert(std::is_trivially_copyable<llm_symbol>::value, "llm_symbol is not trivially copyable"); ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:6882:52: error: expected primary-expression before ‘>’ token static_assert(std::is_trivially_copyable<llm_symbol>::value, "llm_symbol is not trivially copyable"); ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:6882:53: error: ‘::value’ has not been declared static_assert(std::is_trivially_copyable<llm_symbol>::value, "llm_symbol is not trivially copyable"); ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp: In function ‘llama_grammar* llama_grammar_init(const llama_grammar_element**, size_t, size_t)’: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:7906:75: warning: missing initializer for member ‘llama_partial_utf8::value’ [-Wmissing-field-initializers] return new llama_grammar{ std::move(vec_rules), std::move(stacks), {} }; ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:7906:75: warning: missing initializer for member ‘llama_partial_utf8::n_remain’ [-Wmissing-field-initializers] /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp: In function ‘void llama_model_quantize_internal(const string&, const string&, const llama_model_quantize_params*)’: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:9236:53: warning: missing initializer for member ‘std::array<long int, 16ul>::_M_elems’
[-Wmissing-field-initializers] std::array<int64_t, 1 << 4> hist_cur = {}; ^ /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp: In lambda function: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d/vendor/llama.cpp/llama.cpp:9253:63: warning: missing initializer for member ‘std::array<long int, 16ul>::_M_elems’ [-Wmissing-field-initializers] std::array<int64_t, 1 << 4> local_hist = {}; ^ ninja: build stopped: subcommand failed.

*** CMake build failed error: subprocess-exited-with-error

× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully. │ exit code: 1 ╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip. full command: /apps/workarea/PycharmProjects/llmsql/venv/bin/python /apps/workarea/PycharmProjects/llmsql/venv/lib/python3.9/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py build_wheel /tmp/tmp_frsv_ww cwd: /tmp/pip-install-ks5_8k9q/llama-cpp-python_e6a4ed853e294557a0a1e289122a6b2d Building wheel for llama-cpp-python (pyproject.toml) ... error ERROR: Failed building wheel for llama-cpp-python Failed to build llama-cpp-python ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

rishabhgupta93 avatar Jan 25 '24 16:01 rishabhgupta93

I'm getting a similar error on my M1 Mac when running `CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python` or even just `pip install llama-cpp-python`. I've tried on every 0.2.x version and even 0.1.85.

michael-long88 avatar Feb 01 '24 02:02 michael-long88

gcc needs to be upgraded to version 11; once it's upgraded, the install works fine.
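For context: the compile errors in the RHEL 7 log above (`std::is_trivially_copyable` missing from `std::`, no matching `std::string` overload of `std::regex_replace`) are symptoms of GCC 4.8's incomplete C++11 standard library, so a newer toolchain resolves them. A hedged sketch for RHEL/CentOS 7 follows; the Software Collections route and the `devtoolset-11` package name are assumptions, so verify what your repositories actually provide:

```shell
# Assumed RHEL/CentOS 7 route via Software Collections (verify package names first):
#   sudo yum install -y centos-release-scl devtoolset-11
#   scl enable devtoolset-11 bash
#   pip install --no-cache-dir llama-cpp-python
#
# Small helper to sanity-check a compiler version string before building:
check_gcc() {
    major=${1%%.*}    # keep only the major component, e.g. "5.5.0" -> "5"
    if [ "$major" -ge 11 ]; then echo "ok"; else echo "too old"; fi
}
check_gcc "5.5.0"    # compiler from the second log in this thread; prints "too old"
check_gcc "11.2.1"   # e.g. a devtoolset-11 compiler; prints "ok"
```

In practice you would run `check_gcc "$(gcc -dumpversion)"` inside the shell that pip will use, since `scl enable` only affects the current session.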

Gaozizhong avatar Feb 01 '24 07:02 Gaozizhong

Thanks @Gaozizhong for pointing me in the right direction. Had to uninstall and reinstall Xcode on my M1 and that seemed to have done the trick.
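For anyone else hitting this on Apple Silicon, a sketch of the reinstall sequence that worked above; these are the standard Apple and pip commands, but treat the exact steps as a suggestion rather than a guaranteed fix, and note that `--no-cache-dir` is there so pip doesn't reuse a previously cached build:

```shell
# Remove and reinstall the Command Line Tools, then rebuild from scratch:
sudo rm -rf /Library/Developer/CommandLineTools
xcode-select --install
CMAKE_ARGS="-DLLAMA_METAL=on" pip install --no-cache-dir llama-cpp-python
```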

michael-long88 avatar Feb 02 '24 01:02 michael-long88

this did the job for me

hamza233 avatar Feb 08 '24 20:02 hamza233

I met the same problem. Can anyone help?

pip install llama-cpp-python --verbose

Collecting llama-cpp-python Using cached llama_cpp_python-0.2.51.tar.gz (36.7 MB) Running command pip subprocess to install build dependencies Collecting scikit-build-core>=0.5.1 (from scikit-build-core[pyproject]>=0.5.1) Using cached scikit_build_core-0.8.1-py3-none-any.whl.metadata (19 kB) Collecting exceptiongroup (from scikit-build-core>=0.5.1->scikit-build-core[pyproject]>=0.5.1) Using cached exceptiongroup-1.2.0-py3-none-any.whl.metadata (6.6 kB) Collecting importlib-resources>=1.3 (from scikit-build-core>=0.5.1->scikit-build-core[pyproject]>=0.5.1) Using cached importlib_resources-6.1.2-py3-none-any.whl.metadata (3.9 kB) Collecting packaging>=20.9 (from scikit-build-core>=0.5.1->scikit-build-core[pyproject]>=0.5.1) Using cached packaging-23.2-py3-none-any.whl.metadata (3.2 kB) Collecting tomli>=1.1 (from scikit-build-core>=0.5.1->scikit-build-core[pyproject]>=0.5.1) Using cached tomli-2.0.1-py3-none-any.whl.metadata (8.9 kB) Collecting typing-extensions>=3.10.0 (from scikit-build-core>=0.5.1->scikit-build-core[pyproject]>=0.5.1) Using cached typing_extensions-4.10.0-py3-none-any.whl.metadata (3.0 kB) Collecting pathspec>=0.10.1 (from scikit-build-core[pyproject]>=0.5.1) Using cached pathspec-0.12.1-py3-none-any.whl.metadata (21 kB) Collecting pyproject-metadata>=0.5 (from scikit-build-core[pyproject]>=0.5.1) Using cached pyproject_metadata-0.7.1-py3-none-any.whl.metadata (3.0 kB) Collecting zipp>=3.1.0 (from importlib-resources>=1.3->scikit-build-core>=0.5.1->scikit-build-core[pyproject]>=0.5.1) Using cached zipp-3.17.0-py3-none-any.whl.metadata (3.7 kB) Using cached scikit_build_core-0.8.1-py3-none-any.whl (139 kB) Using cached importlib_resources-6.1.2-py3-none-any.whl (34 kB) Using cached packaging-23.2-py3-none-any.whl (53 kB) Using cached pathspec-0.12.1-py3-none-any.whl (31 kB) Using cached pyproject_metadata-0.7.1-py3-none-any.whl (7.4 kB) Using cached tomli-2.0.1-py3-none-any.whl (12 kB) Using cached typing_extensions-4.10.0-py3-none-any.whl (33 kB) 
Using cached exceptiongroup-1.2.0-py3-none-any.whl (16 kB) Using cached zipp-3.17.0-py3-none-any.whl (7.4 kB) Installing collected packages: zipp, typing-extensions, tomli, pathspec, packaging, exceptiongroup, pyproject-metadata, importlib-resources, scikit-build-core Successfully installed exceptiongroup-1.2.0 importlib-resources-6.1.2 packaging-23.2 pathspec-0.12.1 pyproject-metadata-0.7.1 scikit-build-core-0.8.1 tomli-2.0.1 typing-extensions-4.10.0 zipp-3.17.0 Installing build dependencies ... done Running command Getting requirements to build wheel Getting requirements to build wheel ... done Running command pip subprocess to install backend dependencies Collecting cmake>=3.21 Using cached cmake-3.28.3-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.metadata (6.3 kB) Collecting ninja>=1.5 Using cached ninja-1.11.1.1-py2.py3-none-manylinux1_x86_64.manylinux_2_5_x86_64.whl.metadata (5.3 kB) Using cached cmake-3.28.3-py2.py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (26.3 MB) Using cached ninja-1.11.1.1-py2.py3-none-manylinux1_x86_64.manylinux_2_5_x86_64.whl (307 kB) Installing collected packages: ninja, cmake Successfully installed cmake-3.28.3 ninja-1.11.1.1 Installing backend dependencies ... done Running command Preparing metadata (pyproject.toml) *** scikit-build-core 0.8.1 using CMake 3.28.3 (metadata_wheel) Preparing metadata (pyproject.toml) ... 
done Requirement already satisfied: typing-extensions>=4.5.0 in /mnt/nas/home/zhoushengyang/anaconda3/envs/rllm-2/lib/python3.8/site-packages (from llama-cpp-python) (4.10.0) Requirement already satisfied: numpy>=1.20.0 in /mnt/nas/home/zhoushengyang/anaconda3/envs/rllm-2/lib/python3.8/site-packages (from llama-cpp-python) (1.24.4) Collecting diskcache>=5.6.1 (from llama-cpp-python) Obtaining dependency information for diskcache>=5.6.1 from https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl.metadata Using cached diskcache-5.6.3-py3-none-any.whl.metadata (20 kB) Collecting jinja2>=2.11.3 (from llama-cpp-python) Obtaining dependency information for jinja2>=2.11.3 from https://files.pythonhosted.org/packages/30/6d/6de6be2d02603ab56e72997708809e8a5b0fbfee080735109b40a3564843/Jinja2-3.1.3-py3-none-any.whl.metadata Using cached Jinja2-3.1.3-py3-none-any.whl.metadata (3.3 kB) Collecting MarkupSafe>=2.0 (from jinja2>=2.11.3->llama-cpp-python) Obtaining dependency information for MarkupSafe>=2.0 from https://files.pythonhosted.org/packages/c7/bd/50319665ce81bb10e90d1cf76f9e1aa269ea6f7fa30ab4521f14d122a3df/MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata Using cached MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.0 kB) Using cached diskcache-5.6.3-py3-none-any.whl (45 kB) Using cached Jinja2-3.1.3-py3-none-any.whl (133 kB) Using cached MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (26 kB) Building wheels for collected packages: llama-cpp-python Running command Building wheel for llama-cpp-python (pyproject.toml) *** scikit-build-core 0.8.1 using CMake 3.28.3 (wheel) *** Configuring CMake... 
loading initial cache file /tmp/tmpsosgfisd/build/CMakeInit.txt -- The C compiler identification is GNU 5.5.0 -- The CXX compiler identification is GNU 5.5.0 -- Detecting C compiler ABI info -- Detecting C compiler ABI info - done -- Check for working C compiler: /usr/bin/cc - skipped -- Detecting C compile features -- Detecting C compile features - done -- Detecting CXX compiler ABI info -- Detecting CXX compiler ABI info - done -- Check for working CXX compiler: /usr/bin/c++ - skipped -- Detecting CXX compile features -- Detecting CXX compile features - done -- Found Git: /usr/bin/git (found version "2.7.4") -- Performing Test CMAKE_HAVE_LIBC_PTHREAD -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed -- Check if compiler accepts -pthread -- Check if compiler accepts -pthread - yes -- Found Threads: TRUE -- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF -- CMAKE_SYSTEM_PROCESSOR: x86_64 -- x86 detected CMake Warning (dev) at CMakeLists.txt:21 (install): Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION. This warning is for project developers. Use -Wno-dev to suppress it.

CMake Warning (dev) at CMakeLists.txt:30 (install): Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION. This warning is for project developers. Use -Wno-dev to suppress it.

-- Configuring done (0.8s) -- Generating done (0.0s) -- Build files have been written to: /tmp/tmpsosgfisd/build *** Building project with Ninja... Change Dir: '/tmp/tmpsosgfisd/build'

Run Build Command(s): /tmp/pip-build-env-6xh7v2tz/normal/lib/python3.8/site-packages/ninja/data/bin/ninja -v [1/22] cd /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp && /tmp/pip-build-env-6xh7v2tz/normal/lib/python3.8/site-packages/cmake/data/bin/cmake -DMSVC= -DCMAKE_C_COMPILER_VERSION=5.5.0 -DCMAKE_C_COMPILER_ID=GNU -DCMAKE_VS_PLATFORM_NAME= -DCMAKE_C_COMPILER=/usr/bin/cc -P /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/../scripts/gen-build-info-cpp.cmake -- Found Git: /usr/bin/git (found version "2.7.4") [2/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -O3 -DNDEBUG -std=gnu++11 -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/build_info.dir/build-info.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/build_info.dir/build-info.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/build_info.dir/build-info.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/build-info.cpp [3/22] /usr/bin/cc -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-quants.c FAILED: vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o /usr/bin/cc -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-quants.c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-quants.c: In function ‘ggml_vec_dot_iq2_xs_q8_K’: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-quants.c:9251:42: error: implicit declaration of function ‘_mm256_set_m128i’ [-Werror=implicit-function-declaration] const __m256i full_signs_1 = _mm256_set_m128i(full_signs_l, full_signs_l); ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-quants.c:9251:42: error: incompatible types when initializing type ‘__m256i {aka const __vector(4) long long int}’ using type ‘int’ 
/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-quants.c:9252:42: error: incompatible types when initializing type ‘__m256i {aka const __vector(4) long long int}’ using type ‘int’ const __m256i full_signs_2 = _mm256_set_m128i(full_signs_h, full_signs_h); ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-quants.c: In function ‘ggml_vec_dot_iq4_nl_q8_0’: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-quants.c:9895:31: error: incompatible types when initializing type ‘__m256i {aka const __vector(4) long long int}’ using type ‘int’ const __m256i q4b_1 = _mm256_set_m128i(_mm_shuffle_epi8(values128, _mm_and_si128(_mm_srli_epi16(q4bits_1, 4), m4b)), ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-quants.c:9897:31: error: incompatible types when initializing type ‘__m256i {aka const __vector(4) long long int}’ using type ‘int’ const __m256i q4b_2 = _mm256_set_m128i(_mm_shuffle_epi8(values128, _mm_and_si128(_mm_srli_epi16(q4bits_2, 4), m4b)), ^ cc1: some warnings being treated as errors [4/22] /usr/bin/cc -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-alloc.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-alloc.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-alloc.c.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-alloc.c [5/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/console.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/console.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/console.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/console.cpp [6/22] /usr/bin/cc -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-backend.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-backend.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-backend.c.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml-backend.c [7/22] /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../.. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wno-cast-qual -pthread -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/llava.cpp FAILED: vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../.. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu++11 -fPIC -Wno-cast-qual -pthread -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/llava.cpp In file included from /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/llava.cpp:5:0: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common/base64.hpp: In static member function ‘static Output_iterator base64::encode(Input_iterator, Input_iterator, Output_iterator, base64::alphabet)’: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common/base64.hpp:84:42: error: ‘alphabet’ is not a class, namespace, or enumeration const char* alpha = alphabet == alphabet::url_filename_safe ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common/base64.hpp: In static member function ‘static uint8_t base64::_base64_value(base64::alphabet&, char)’: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common/base64.hpp:355:25: error: ‘alphabet’ is not a class, namespace, or enumeration if (alphabet == alphabet::standard) { ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common/base64.hpp:361:32: error: ‘alphabet’ is not a class, namespace, or enumeration } else if (alphabet == alphabet::url_filename_safe) { ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common/base64.hpp:370:28: error: ‘alphabet’ is not a class, namespace, or enumeration alphabet = 
alphabet::standard; ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common/base64.hpp:374:28: error: ‘alphabet’ is not a class, namespace, or enumeration alphabet = alphabet::standard; ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common/base64.hpp:378:28: error: ‘alphabet’ is not a class, namespace, or enumeration alphabet = alphabet::url_filename_safe; ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common/base64.hpp:382:28: error: ‘alphabet’ is not a class, namespace, or enumeration alphabet = alphabet::url_filename_safe; ^ [8/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/grammar-parser.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/grammar-parser.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/grammar-parser.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/grammar-parser.cpp [9/22] /usr/bin/c++ -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../.. 
-I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common -O3 -DNDEBUG -std=gnu++11 -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/llava-cli.cpp FAILED: vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o /usr/bin/c++ -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../.. 
-I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common -O3 -DNDEBUG -std=gnu++11 -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava-cli.dir/llava-cli.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/llava-cli.cpp In file included from /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/llava-cli.cpp:7:0: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/./base64.hpp: In static member function ‘static Output_iterator base64::encode(Input_iterator, Input_iterator, Output_iterator, base64::alphabet)’: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/./base64.hpp:84:42: error: ‘alphabet’ is not a class, namespace, or enumeration const char* alpha = alphabet == alphabet::url_filename_safe ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/./base64.hpp: In static member function ‘static uint8_t base64::_base64_value(base64::alphabet&, char)’: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/./base64.hpp:355:25: error: ‘alphabet’ is not a class, namespace, or enumeration if (alphabet == alphabet::standard) { ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/./base64.hpp:361:32: error: ‘alphabet’ is not a class, namespace, or enumeration } else if (alphabet == alphabet::url_filename_safe) { ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/./base64.hpp:370:28: error: ‘alphabet’ is not a class, namespace, or enumeration 
alphabet = alphabet::standard; ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/./base64.hpp:374:28: error: ‘alphabet’ is not a class, namespace, or enumeration alphabet = alphabet::standard; ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/./base64.hpp:378:28: error: ‘alphabet’ is not a class, namespace, or enumeration alphabet = alphabet::url_filename_safe; ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/./base64.hpp:382:28: error: ‘alphabet’ is not a class, namespace, or enumeration alphabet = alphabet::url_filename_safe; ^ [10/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/sampling.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/sampling.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/sampling.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/sampling.cpp [11/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu++11 -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/train.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/train.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/train.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/train.cpp [12/22] /usr/bin/c++ -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -march=native -MD -MT vendor/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o -MF vendor/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o.d -o vendor/llama.cpp/common/CMakeFiles/common.dir/common.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/common/common.cpp [13/22] /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../.. -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/../../common -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu++11 -fPIC -Wno-cast-qual -pthread -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/examples/llava/clip.cpp [14/22] /usr/bin/cc -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/ggml.c [15/22] /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -Dllama_EXPORTS -I/tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/. 
-O3 -DNDEBUG -std=gnu++11 -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-array-bounds -march=native -pthread -MD -MT vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o -MF vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o.d -o vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o -c /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/llama.cpp /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/llama.cpp: In member function ‘bool llama_model_loader::get_key(llm_kv, T&, bool) [with T = llama_pooling_type]’: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/llama.cpp:2890:43: warning: ‘tmp’ may be used uninitialized in this function [-Wmaybe-uninitialized] result = (enum llama_pooling_type) tmp; ^ /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/llama.cpp: In function ‘void llama_model_quantize_internal(const string&, const string&, const llama_model_quantize_params*)’: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16/vendor/llama.cpp/llama.cpp:11183:27: warning: ‘new_type’ may be used uninitialized in this function [-Wmaybe-uninitialized] new_type == GGML_TYPE_IQ1_S || ^ ninja: build stopped: subcommand failed.

*** CMake build failed
error: subprocess-exited-with-error

× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
full command: /home/zhoushengyang/anaconda3/envs/rllm-2/bin/python3.8 /home/zhoushengyang/anaconda3/envs/rllm-2/lib/python3.8/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py build_wheel /tmp/tmptpfrfuuc
cwd: /tmp/pip-install-mlxre0ob/llama-cpp-python_84a4a8ea3dde44d8b739c814605e3b16
Building wheel for llama-cpp-python (pyproject.toml) ... error
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

Arrebol-logos avatar Feb 26 '24 12:02 Arrebol-logos
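The repeated "'alphabet' is not a class, namespace, or enumeration" errors in the log above come from vendor/llama.cpp/common/base64.hpp, where a function parameter shares the name of a scoped enum; old GCC (the 4.8.5 that RHEL 7 ships by default) appears to reject the qualified lookup that newer compilers accept. A minimal pre-flight check before retrying the install is sketched below; the ">= 5" threshold is an assumption on my part (a commenter below reports success after upgrading to GCC 11), and `parse_major`/`check_compiler` are illustrative helper names, not part of any tool:

```shell
#!/bin/sh
# Pre-flight check: is the C++ compiler new enough to build llama.cpp?
# RHEL 7's default g++ is 4.8.5, which fails on base64.hpp as shown above.

parse_major() {
    # Extract the leading major version from a version string,
    # e.g. "4.8.5" -> "4", "11.4.0" -> "11".
    printf '%s\n' "$1" | sed 's/[^0-9].*$//'
}

check_compiler() {
    version="$1"
    major=$(parse_major "$version")
    if [ -n "$major" ] && [ "$major" -ge 5 ]; then
        echo "g++ $version looks new enough"
    else
        echo "g++ $version is too old; install a newer toolchain (e.g. devtoolset on RHEL 7)"
    fi
}

# On a real machine, feed it the live compiler version:
#   check_compiler "$(g++ -dumpversion)"
check_compiler "4.8.5"
check_compiler "11.4.0"
```

On RHEL 7 a newer compiler is typically obtained without replacing the system GCC, e.g. via a Software Collections devtoolset enabled with `scl enable devtoolset-11 bash` before running pip.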

In my case, downgrading Python to 3.11.8, with a complete removal and reinstallation of the libraries, fixed it

techn0man1ac avatar Aug 07 '24 13:08 techn0man1ac
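The fix described above can be sketched as recreating the virtual environment from scratch; this is a sketch only, and having a `python3.11` interpreter on PATH (and a venv named `venv`) are assumptions:

```shell
# Throw away the old environment and rebuild it on Python 3.11,
# reinstalling packages while bypassing any cached (broken) wheel.
rm -rf venv
python3.11 -m venv venv
. venv/bin/activate
pip install --upgrade pip
pip install --no-cache-dir llama-cpp-python
```

`--no-cache-dir` matters here: without it, pip may reuse a previously failed build's cached artifacts.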

gcc needs to be upgraded to version 11; once it is upgraded, the package installs normally

How do I upgrade the gcc version on Ubuntu 18.04?

muzamil47 avatar Aug 21 '24 06:08 muzamil47
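For the Ubuntu 18.04 question, one common route is the ubuntu-toolchain-r/test PPA. This is a sketch, not an official recommendation; availability of the gcc-11 packages for 18.04 (bionic) is an assumption worth verifying before relying on it:

```shell
# Add the toolchain PPA and install a newer GCC.
# (add-apt-repository may require the software-properties-common package.)
sudo add-apt-repository ppa:ubuntu-toolchain-r/test
sudo apt-get update
sudo apt-get install -y gcc-11 g++-11

# CMake-based builds such as llama-cpp-python honor the CC/CXX environment
# variables, so the system default compiler does not need to change:
CC=gcc-11 CXX=g++-11 pip install llama-cpp-python --no-cache-dir
```

Setting `CC`/`CXX` per-command keeps the rest of the system on the stock compiler, which is usually safer than rewiring `update-alternatives`.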