DB-GPT
[Bug] [Install] Failed building wheel for llama-cpp-python
Search before asking
- [X] I have searched the issues and found no similar issues.
Operating system information
MacOS(M1, M2...)
Python version information
=3.11
DB-GPT version
main
Related scenes
- [ ] Chat Data
- [ ] Chat Excel
- [ ] Chat DB
- [ ] Chat Knowledge
- [ ] Model Management
- [ ] Dashboard
- [ ] Plugins
Installation Information
- [ ] AutoDL Image
- [ ] Other
Device information
Device: Apple M2 Pro 32 GB
Models information
LLM: vicuna-13b-v1.5, ggml-model-q4_0.gguf
Embedding model: text2vec-large-chinese
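For context, the gguf file in this list is the one loaded through llama-cpp-python, which is why the failed wheel blocks this setup. A sketch of the assumed local layout (paths illustrative; DB-GPT conventionally reads local weights from the repository's `models` directory, visible in the `ls` output under "How to reproduce"):

```shell
ls models/
#   vicuna-13b-v1.5/           # main LLM weights
#   ggml-model-q4_0.gguf       # quantized model served through llama-cpp-python
#   text2vec-large-chinese/    # embedding model
```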
What happened
Failed building wheel for llama-cpp-python
What you expected to happen
The editable install (`pip install -e ".[llama_cpp]"`) should build and install llama-cpp-python successfully. Instead, the build stops with:

ninja: build stopped: subcommand failed.
*** CMake build failed
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
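The step that actually fails in the build log below is the Metal shader compilation that llama.cpp's CMake build runs via `xcrun ... metal`. A minimal sketch for checking whether that tool is available, assuming standard Apple developer tooling:

```shell
# Show which developer directory is active. The standalone Command Line Tools
# (/Library/Developer/CommandLineTools) do not ship the Metal compiler.
xcode-select -p

# Ask xcrun to locate the "metal" utility the build invokes. If this prints
# 'unable to find utility "metal"', the toolchain is the problem rather than
# llama-cpp-python itself.
xcrun -sdk macosx -f metal
```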
How to reproduce
```
❯ ls
CODE_OF_CONDUCT    Makefile       assets           docker              logs               plugins       tests
CONTRIBUTING.md    README.md      cd               docker-compose.yml  models             requirements  tools
LICENSE            README.zh.md   dbgpt            docs                package-lock.json  scripts       web
MANIFEST.in        and            dbgpt.egg-info   examples            pilot              setup.py

❯ pip install -e ".[llama_cpp]"
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Obtaining file:///Users/xuji/Project/PythonProject/DB-GPT
  Preparing metadata (setup.py) ... done
Requirement already satisfied: aiohttp==3.8.4 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from dbgpt==0.5.1) (3.8.4)
Collecting chardet==5.1.0 (from dbgpt==0.5.1)
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/74/8f/8fc49109009e8d2169d94d72e6b1f4cd45c13d147ba7d6170fb41f22b08f/chardet-5.1.0-py3-none-any.whl (199 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 199.1/199.1 kB 587.2 kB/s eta 0:00:00
Collecting importlib-resources==5.12.0 (from dbgpt==0.5.1)
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/38/71/c13ea695a4393639830bf96baea956538ba7a9d06fcce7cef10bfff20f72/importlib_resources-5.12.0-py3-none-any.whl (36 kB)
Requirement already satisfied: python-dotenv==1.0.0 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from dbgpt==0.5.1) (1.0.0)
Requirement already satisfied: cachetools in /Users/xuji/anaconda3/lib/python3.11/site-packages (from dbgpt==0.5.1) (5.3.2)
Collecting pydantic<2,>=1 (from dbgpt==0.5.1)
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/4b/44/439860148466c6a541a2916fc379a5730b16ef3c7d433e30a6041d36d7bb/pydantic-1.10.14-cp311-cp311-macosx_11_0_arm64.whl (2.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.5/2.5 MB 388.4 kB/s eta 0:00:00
Collecting typeguard (from dbgpt==0.5.1)
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/18/01/5fc45558268ced46d86292763477996a3cdd505567cd590a688e8cdc386e/typeguard-4.1.5-py3-none-any.whl (34 kB)
Collecting llama-cpp-python (from dbgpt==0.5.1)
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/8e/ae/551f28037d9a49693f7b09b0e22912be4e839b1af5f4ae6ab721162a37a4/llama_cpp_python-0.2.57.tar.gz (36.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 36.9/36.9 MB 184.7 kB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: attrs>=17.3.0 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from aiohttp==3.8.4->dbgpt==0.5.1) (23.1.0)
Requirement already satisfied: charset-normalizer<4.0,>=2.0 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from aiohttp==3.8.4->dbgpt==0.5.1) (3.1.0)
Requirement already satisfied: multidict<7.0,>=4.5 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from aiohttp==3.8.4->dbgpt==0.5.1) (6.0.4)
Requirement already satisfied: async-timeout<5.0,>=4.0.0a3 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from aiohttp==3.8.4->dbgpt==0.5.1) (4.0.2)
Requirement already satisfied: yarl<2.0,>=1.0 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from aiohttp==3.8.4->dbgpt==0.5.1) (1.9.2)
Requirement already satisfied: frozenlist>=1.1.1 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from aiohttp==3.8.4->dbgpt==0.5.1) (1.3.3)
Requirement already satisfied: aiosignal>=1.1.2 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from aiohttp==3.8.4->dbgpt==0.5.1) (1.3.1)
Requirement already satisfied: typing-extensions>=4.2.0 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from pydantic<2,>=1->dbgpt==0.5.1) (4.8.0)
Requirement already satisfied: numpy>=1.20.0 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from llama-cpp-python->dbgpt==0.5.1) (1.24.3)
Collecting diskcache>=5.6.1 (from llama-cpp-python->dbgpt==0.5.1)
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl (45 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 45.5/45.5 kB 219.9 kB/s eta 0:00:00
Requirement already satisfied: jinja2>=2.11.3 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from llama-cpp-python->dbgpt==0.5.1) (3.1.2)
Requirement already satisfied: MarkupSafe>=2.0 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from jinja2>=2.11.3->llama-cpp-python->dbgpt==0.5.1) (2.1.3)
Requirement already satisfied: idna>=2.0 in /Users/xuji/anaconda3/lib/python3.11/site-packages (from yarl<2.0,>=1.0->aiohttp==3.8.4->dbgpt==0.5.1) (3.4)
Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [66 lines of output]
*** scikit-build-core 0.8.2 using CMake 3.28.3 (wheel)
*** Configuring CMake...
2024-03-20 11:48:14,473 - scikit_build_core - WARNING - libdir/ldlibrary: /Users/xuji/anaconda3/lib/libpython3.11.a is not a real file!
2024-03-20 11:48:14,473 - scikit_build_core - WARNING - Can't find a Python library, got libdir=/Users/xuji/anaconda3/lib, ldlibrary=libpython3.11.a, multiarch=darwin, masd=None
loading initial cache file /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/CMakeInit.txt
-- The C compiler identification is AppleClang 15.0.0.15000309
-- The CXX compiler identification is AppleClang 15.0.0.15000309
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /Library/Developer/CommandLineTools/usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /Library/Developer/CommandLineTools/usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: /usr/bin/git (found version "2.39.3 (Apple Git-146)")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Accelerate framework found
-- Metal framework found
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: arm64
-- ARM detected
-- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E
-- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E - Failed
CMake Warning (dev) at vendor/llama.cpp/CMakeLists.txt:1218 (install):
Target llama has RESOURCE files but no RESOURCE DESTINATION.
This warning is for project developers. Use -Wno-dev to suppress it.
CMake Warning (dev) at CMakeLists.txt:21 (install):
Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
This warning is for project developers. Use -Wno-dev to suppress it.
CMake Warning (dev) at CMakeLists.txt:30 (install):
Target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
This warning is for project developers. Use -Wno-dev to suppress it.
-- Configuring done (0.7s)
-- Generating done (0.0s)
-- Build files have been written to: /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build
*** Building project with Ninja...
Change Dir: '/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build'
Run Build Command(s): /Users/xuji/anaconda3/bin/ninja -v
[1/25] cd /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/vendor/llama.cpp && xcrun -sdk macosx metal -O3 -c /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-metal.metal -o /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-metal.air && xcrun -sdk macosx metallib /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-metal.air -o /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/default.metallib && rm -f /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-metal.air && rm -f /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-common.h && rm -f /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-metal.metal
FAILED: bin/default.metallib /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/default.metallib
cd /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/vendor/llama.cpp && xcrun -sdk macosx metal -O3 -c /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-metal.metal -o /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-metal.air && xcrun -sdk macosx metallib /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-metal.air -o /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/default.metallib && rm -f /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-metal.air && rm -f /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-common.h && rm -f /var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/tmp8r4sq7lr/build/bin/ggml-metal.metal
xcrun: error: unable to find utility "metal", not a developer tool or in PATH
[2/25] cd /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp && /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-build-env-5oxmtbv_/normal/lib/python3.11/site-packages/cmake/data/bin/cmake -DMSVC= -DCMAKE_C_COMPILER_VERSION=15.0.0.15000309 -DCMAKE_C_COMPILER_ID=AppleClang -DCMAKE_VS_PLATFORM_NAME= -DCMAKE_C_COMPILER=/Library/Developer/CommandLineTools/usr/bin/cc -P /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/common/../scripts/gen-build-info-cpp.cmake
-- Found Git: /usr/bin/git (found version "2.39.3 (Apple Git-146)")
[3/25] /Library/Developer/CommandLineTools/usr/bin/cc -DACCELERATE_LAPACK_ILP64 -DACCELERATE_NEW_LAPACK -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_ACCELERATE -DGGML_USE_METAL -D_DARWIN_C_SOURCE -D_XOPEN_SOURCE=600 -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/. -F/Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk/System/Library/Frameworks -O3 -DNDEBUG -std=gnu11 -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk -mmacosx-version-min=14.3 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-alloc.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-alloc.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-alloc.c.o -c /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/ggml-alloc.c
[4/25] /Library/Developer/CommandLineTools/usr/bin/cc -DACCELERATE_LAPACK_ILP64 -DACCELERATE_NEW_LAPACK -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_ACCELERATE -DGGML_USE_METAL -D_DARWIN_C_SOURCE -D_XOPEN_SOURCE=600 -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/. -F/Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk/System/Library/Frameworks -O3 -DNDEBUG -std=gnu11 -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk -mmacosx-version-min=14.3 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-backend.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-backend.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-backend.c.o -c /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/ggml-backend.c
[5/25] /Library/Developer/CommandLineTools/usr/bin/c++ -DGGML_USE_METAL -DLLAMA_BUILD -DLLAMA_SHARED -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/examples/llava/. -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/examples/llava/../.. -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/examples/llava/../../common -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk -mmacosx-version-min=14.3 -fPIC -Wno-cast-qual -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/llava.cpp.o -c /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/examples/llava/llava.cpp
[6/25] /Library/Developer/CommandLineTools/usr/bin/cc -DACCELERATE_LAPACK_ILP64 -DACCELERATE_NEW_LAPACK -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_ACCELERATE -DGGML_USE_METAL -D_DARWIN_C_SOURCE -D_XOPEN_SOURCE=600 -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/. -F/Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk/System/Library/Frameworks -O3 -DNDEBUG -std=gnu11 -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk -mmacosx-version-min=14.3 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-metal.m.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-metal.m.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-metal.m.o -c /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/ggml-metal.m
[7/25] /Library/Developer/CommandLineTools/usr/bin/cc -DACCELERATE_LAPACK_ILP64 -DACCELERATE_NEW_LAPACK -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_ACCELERATE -DGGML_USE_METAL -D_DARWIN_C_SOURCE -D_XOPEN_SOURCE=600 -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/. -F/Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk/System/Library/Frameworks -O3 -DNDEBUG -std=gnu11 -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk -mmacosx-version-min=14.3 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml-quants.c.o -c /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/ggml-quants.c
[8/25] /Library/Developer/CommandLineTools/usr/bin/c++ -DACCELERATE_LAPACK_ILP64 -DACCELERATE_NEW_LAPACK -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_ACCELERATE -DGGML_USE_METAL -DLLAMA_BUILD -DLLAMA_SHARED -D_DARWIN_C_SOURCE -D_XOPEN_SOURCE=600 -Dllama_EXPORTS -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/. -F/Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk/System/Library/Frameworks -O3 -DNDEBUG -std=gnu++11 -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk -mmacosx-version-min=14.3 -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi -MD -MT vendor/llama.cpp/CMakeFiles/llama.dir/unicode.cpp.o -MF vendor/llama.cpp/CMakeFiles/llama.dir/unicode.cpp.o.d -o vendor/llama.cpp/CMakeFiles/llama.dir/unicode.cpp.o -c /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/unicode.cpp
[9/25] /Library/Developer/CommandLineTools/usr/bin/c++ -DGGML_USE_METAL -DLLAMA_BUILD -DLLAMA_SHARED -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/examples/llava/. -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/examples/llava/../.. -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/examples/llava/../../common -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk -mmacosx-version-min=14.3 -fPIC -Wno-cast-qual -MD -MT vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o -MF vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o.d -o vendor/llama.cpp/examples/llava/CMakeFiles/llava.dir/clip.cpp.o -c /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/examples/llava/clip.cpp
[10/25] /Library/Developer/CommandLineTools/usr/bin/cc -DACCELERATE_LAPACK_ILP64 -DACCELERATE_NEW_LAPACK -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_ACCELERATE -DGGML_USE_METAL -D_DARWIN_C_SOURCE -D_XOPEN_SOURCE=600 -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/. -F/Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk/System/Library/Frameworks -O3 -DNDEBUG -std=gnu11 -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk -mmacosx-version-min=14.3 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion -MD -MT vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o -MF vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o.d -o vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.o -c /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/ggml.c
[11/25] /Library/Developer/CommandLineTools/usr/bin/c++ -DACCELERATE_LAPACK_ILP64 -DACCELERATE_NEW_LAPACK -DGGML_SCHED_MAX_COPIES=4 -DGGML_USE_ACCELERATE -DGGML_USE_METAL -DLLAMA_BUILD -DLLAMA_SHARED -D_DARWIN_C_SOURCE -D_XOPEN_SOURCE=600 -Dllama_EXPORTS -I/private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/. -F/Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk/System/Library/Frameworks -O3 -DNDEBUG -std=gnu++11 -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.4.sdk -mmacosx-version-min=14.3 -fPIC -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi -MD -MT vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o -MF vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o.d -o vendor/llama.cpp/CMakeFiles/llama.dir/llama.cpp.o -c /private/var/folders/4s/yy57yhc576xfdy54ydh4y89m0000gn/T/pip-install-i9af0hni/llama-cpp-python_8e9103ef918f44cebc95bc60aa935a2b/vendor/llama.cpp/llama.cpp
ninja: build stopped: subcommand failed.
*** CMake build failed
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

❯ pip --version
pip 23.2.1 from /Users/xuji/anaconda3/lib/python3.11/site-packages/pip (python 3.11)
❯ python --version
Python 3.11.4
```
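If the root cause is that only the Command Line Tools are selected (they lack the `metal` and `metallib` utilities the Metal build step invokes), two possible workarounds, sketched under the assumptions that a full Xcode.app is installed for the first and that the vendored llama.cpp at this version still uses the `LLAMA_METAL` CMake option for the second:

```shell
# Option 1 (assumes full Xcode.app is installed): switch to Xcode's toolchain,
# which bundles the Metal compiler, then retry the editable install.
sudo xcode-select --switch /Applications/Xcode.app/Contents/Developer
pip install -e ".[llama_cpp]"

# Option 2: build llama-cpp-python without Metal acceleration (CPU/Accelerate only),
# passing the CMake option through to scikit-build-core via CMAKE_ARGS.
CMAKE_ARGS="-DLLAMA_METAL=off" pip install -e ".[llama_cpp]"
```

Option 1 keeps Metal GPU offload for the gguf model; option 2 avoids the Metal toolchain dependency entirely at the cost of CPU-only inference.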
Additional context
No response
Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
Got the same issue
This issue has been marked as stale because it has been over 30 days without any activity.
This issue has been closed because it has been marked as stale and there has been no activity for over 7 days.