private-gpt

Unable to complete the step "requirements.txt" due to errors received

Open pemaldonado1967 opened this issue 2 years ago • 15 comments

Describe the bug and how to reproduce it
Using Visual Studio 2022. On the terminal, run `pip install -r requirements.txt`. After a few seconds of running, this message appears:

    Building wheels for collected packages: llama-cpp-python, hnswlib
    Building wheel for llama-cpp-python (pyproject.toml) ... done
    Created wheel for llama-cpp-python: filename=llama_cpp_python-0.1.48-cp311-cp311-win32.whl size=181594 sha256=510833eef901d4c6b755dedb8a8792b634601c15d5b7f2891502b4efd1d2db28
    Stored in directory: c:\users\pemal\appdata\local\pip\cache\wheels\dd\50\e2\8ae28fe2d12429f4fa450e59d23cabf138155cf4bc5387769c
    Building wheel for hnswlib (pyproject.toml) ... error
    error: subprocess-exited-with-error

    × Building wheel for hnswlib (pyproject.toml) did not run successfully.
    │ exit code: 1
    ╰─> [10 lines of output]
        running bdist_wheel
        running build
        running build_ext
        building 'hnswlib' extension
        creating build
        creating build\temp.win32-cpython-311
        creating build\temp.win32-cpython-311\Release
        creating build\temp.win32-cpython-311\Release\python_bindings
        cl.exe /c /nologo /O2 /W3 /GL /DNDEBUG /MD -IC:\Users\pemal\AppData\Local\Temp\pip-build-env-m53bvwqs\overlay\Lib\site-packages\pybind11\include -IC:\Users\pemal\AppData\Local\Temp\pip-build-env-m53bvwqs\overlay\Lib\site-packages\numpy\core\include -I./hnswlib/ -IC:\Users\pemal\AppData\Local\Programs\Python\Python311\include -IC:\Users\pemal\AppData\Local\Programs\Python\Python311\Include "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\cppwinrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.8\include\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\cppwinrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.8\include\um" /EHsc /Tp./python_bindings/bindings.cpp /Fobuild\temp.win32-cpython-311\Release./python_bindings/bindings.obj /EHsc /openmp /O2 /DVERSION_INFO=\"0.7.0\"
        error: command 'cl.exe' failed: None
    [end of output]

    note: This error originates from a subprocess, and is likely not a problem with pip.
    ERROR: Failed building wheel for hnswlib
    Successfully built llama-cpp-python
    Failed to build hnswlib
    ERROR: Could not build wheels for hnswlib, which is required to install pyproject.toml-based projects
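Editor's note: "command 'cl.exe' failed: None" usually means the build backend could not launch a usable C++ compiler at all. A minimal sketch (assuming only the standard tool names for MSVC and MinGW) to check which compilers are visible on PATH from the shell where pip is run:

```python
import shutil

# Check which C/C++ compilers are discoverable on PATH.
# cl = MSVC, g++/gcc = MinGW; a None result means the tool is not
# visible to subprocesses such as pip's build backend.
compilers = {name: shutil.which(name) for name in ("cl", "g++", "gcc")}
for name, path in compilers.items():
    print(f"{name}: {path or 'NOT FOUND on PATH'}")
```

If `cl` is not found, running pip from a Developer Command Prompt for Visual Studio (where the MSVC environment is initialized) is the usual remedy.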

Expected behavior
The requirements install should complete successfully.

Environment (please complete the following information):

  • OS / hardware: [Windows 10 / Intel i7 processor]
  • Python version [e.g. 3.11 64-bit]

Additional context
Installed components:

  • Visual C++ compilers and libraries
  • C++ Universal Windows Platform
  • C++ CMake tools for Windows
  • etc.

pemaldonado1967 avatar May 17 '23 13:05 pemaldonado1967

Did you also do steps 3 and 4 ("3. Download the MinGW installer from the MinGW website" and "4. Run the installer and select the gcc component"), and did you restart afterwards?

NPap0 avatar May 17 '23 13:05 NPap0

Hi, an update: I did steps 3 and 4 plus a restart, then ran `pip install -r requirements.txt` again:

    Building wheels for collected packages: llama-cpp-python, hnswlib
    Building wheel for llama-cpp-python (pyproject.toml) ... done
    Created wheel for llama-cpp-python: filename=llama_cpp_python-0.1.48-cp311-cp311-win32.whl size=181594 sha256=01680bb7f525f5e64453373f1859d84a2951ff531cb0553000946e0e6b65f249
    Stored in directory: c:\users\pemal\appdata\local\pip\cache\wheels\dd\50\e2\8ae28fe2d12429f4fa450e59d23cabf138155cf4bc5387769c
    Building wheel for hnswlib (pyproject.toml) ... error
    error: subprocess-exited-with-error

    × Building wheel for hnswlib (pyproject.toml) did not run successfully.
    │ exit code: 1
    ╰─> [10 lines of output]
        running bdist_wheel
        running build
        running build_ext
        building 'hnswlib' extension
        creating build
        creating build\temp.win32-cpython-311
        creating build\temp.win32-cpython-311\Release
        creating build\temp.win32-cpython-311\Release\python_bindings
        cl.exe /c /nologo /O2 /W3 /GL /DNDEBUG /MD -IC:\Users\pemal\AppData\Local\Temp\pip-build-env-wyet2tde\overlay\Lib\site-packages\pybind11\include -IC:\Users\pemal\AppData\Local\Temp\pip-build-env-wyet2tde\overlay\Lib\site-packages\numpy\core\include -I./hnswlib/ -IC:\Users\pemal\AppData\Local\Programs\Python\Python311\include -IC:\Users\pemal\AppData\Local\Programs\Python\Python311\Include "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\cppwinrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.8\include\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\cppwinrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.8\include\um" /EHsc /Tp./python_bindings/bindings.cpp /Fobuild\temp.win32-cpython-311\Release./python_bindings/bindings.obj /EHsc /openmp /O2 /DVERSION_INFO=\"0.7.0\"
        error: command 'cl.exe' failed: None
    [end of output]

    note: This error originates from a subprocess, and is likely not a problem with pip.
    ERROR: Failed building wheel for hnswlib
    Successfully built llama-cpp-python
    Failed to build hnswlib
    ERROR: Could not build wheels for hnswlib, which is required to install pyproject.toml-based projects

Thanks for your advice.

pemaldonado1967 avatar May 17 '23 13:05 pemaldonado1967

cl.exe is the Microsoft C++ compiler. It is required to compile C++ projects from source. The other option is to use MinGW.

For Visual C++, refer to this: https://stackoverflow.com/a/41724634/6416513

maozdemir avatar May 17 '23 14:05 maozdemir

Update: after ensuring cl.exe exists, updating the PATH variable, and installing MinGW, this error appeared:

    Building wheels for collected packages: llama-cpp-python, hnswlib
    Building wheel for llama-cpp-python (pyproject.toml) ... done
    Created wheel for llama-cpp-python: filename=llama_cpp_python-0.1.48-cp311-cp311-win32.whl size=181594 sha256=2d2a342b2fc3110ffbc14c11e732eb3c86118add972bd4b5e09c24ec95605f9d
    Stored in directory: c:\users\pemal\appdata\local\pip\cache\wheels\dd\50\e2\8ae28fe2d12429f4fa450e59d23cabf138155cf4bc5387769c
    Building wheel for hnswlib (pyproject.toml) ... error
    error: subprocess-exited-with-error

    × Building wheel for hnswlib (pyproject.toml) did not run successfully.
    │ exit code: 1
    ╰─> [12 lines of output]
        running bdist_wheel
        running build
        running build_ext
        building 'hnswlib' extension
        creating build
        creating build\temp.win32-cpython-311
        creating build\temp.win32-cpython-311\Release
        creating build\temp.win32-cpython-311\Release\python_bindings
        "C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\amd64_arm\cl.exe" /c /nologo /O2 /W3 /GL /DNDEBUG /MD -IC:\Users\pemal\AppData\Local\Temp\pip-build-env-bwur3o53\overlay\Lib\site-packages\pybind11\include -IC:\Users\pemal\AppData\Local\Temp\pip-build-env-bwur3o53\overlay\Lib\site-packages\numpy\core\include -I./hnswlib/ -IC:\Users\pemal\AppData\Local\Programs\Python\Python311\include -IC:\Users\pemal\AppData\Local\Programs\Python\Python311\Include "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\cppwinrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.8\include\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.22621.0\cppwinrt" "-IC:\Program Files (x86)\Windows Kits\NETFXSDK\4.8\include\um" /EHsc /Tp./python_bindings/bindings.cpp /Fobuild\temp.win32-cpython-311\Release./python_bindings/bindings.obj /EHsc /openmp /O2 /DVERSION_INFO=\"0.7.0\"
        bindings.cpp
        ./python_bindings/bindings.cpp(1): fatal error C1083: Cannot open include file: 'iostream': No such file or directory
        error: command 'C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\amd64_arm\cl.exe' failed with exit code 2
    [end of output]

    note: This error originates from a subprocess, and is likely not a problem with pip.
    ERROR: Failed building wheel for hnswlib
    Successfully built llama-cpp-python
    Failed to build hnswlib
    ERROR: Could not build wheels for hnswlib, which is required to install pyproject.toml-based projects
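Editor's note on the log above: pip invoked `...\VC\bin\amd64_arm\cl.exe`, an ARM cross-compiler that does not ship the x86/x64 C++ standard library headers, which is why `iostream` cannot be found. MSVC encodes the host/target pair in the bin directory name, so the wrong toolchain is visible in the path. A rough sketch of that check (the helper name is mine, not from any tool):

```python
# MSVC 2015-style layouts use bin\<host>_<target> (e.g. amd64_arm);
# newer layouts use bin\Host<arch>\<target>. An ARM-targeting cl.exe
# cannot build an x86/x64 extension and lacks headers like <iostream>.
def is_arm_cross_compiler(cl_path: str) -> bool:
    """Heuristic: True if this cl.exe path looks like an ARM cross-compiler."""
    parts = cl_path.lower().replace("/", "\\").split("\\")
    return any(p in ("amd64_arm", "amd64_arm64", "x86_arm", "x86_arm64") for p in parts)

bad = r"C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\amd64_arm\cl.exe"
good = r"C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\amd64\cl.exe"
print(is_arm_cross_compiler(bad), is_arm_cross_compiler(good))   # → True False
```

The practical fix is to make the x64-targeting cl.exe the one pip finds, for example by running pip from an "x64 Native Tools Command Prompt".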

Thanks for your continued advice. Pablo

pemaldonado1967 avatar May 17 '23 14:05 pemaldonado1967

Can you try this inside PowerShell?

git clone https://github.com/abetlen/llama-cpp-python
cd llama-cpp-python/vendor
git clone https://github.com/ggerganov/llama.cpp
cd ..
pip3 uninstall llama-cpp-python
pip3 install scikit-build
$Env:CMAKE_ARGS="-DLLAMA_CUBLAS=on"; $Env:FORCE_CMAKE=1; python3 ./setup.py install
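Once that build script finishes, a quick editorial sketch to confirm the package actually landed in the active environment (it prints False rather than raising if the build did not install anything):

```python
import importlib.util

# Check whether the llama_cpp module (the import name used by
# llama-cpp-python) is importable, without actually importing it.
spec = importlib.util.find_spec("llama_cpp")
print("llama_cpp installed:", spec is not None)
```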

maozdemir avatar May 17 '23 15:05 maozdemir

Hi, I did it but it fails (disregard the "already exists" messages, as I ran it twice to be sure):

    PS C:\Users\pemal\source\repos\GPT\privateGPT> git clone https://github.com/abetlen/llama-cpp-python
    fatal: destination path 'llama-cpp-python' already exists and is not an empty directory.
    PS C:\Users\pemal\source\repos\GPT\privateGPT> cd llama-cpp-python/vendor
    PS C:\Users\pemal\source\repos\GPT\privateGPT\llama-cpp-python\vendor> git clone https://github.com/ggerganov/llama.cpp
    fatal: destination path 'llama.cpp' already exists and is not an empty directory.
    PS C:\Users\pemal\source\repos\GPT\privateGPT\llama-cpp-python\vendor> cd ..
    PS C:\Users\pemal\source\repos\GPT\privateGPT\llama-cpp-python> pip3 uninstall llama-cpp-python
    WARNING: Skipping llama-cpp-python as it is not installed.
    PS C:\Users\pemal\source\repos\GPT\privateGPT\llama-cpp-python> pip3 install scikit-build

    $Env:CMAKE_ARGS="-DLLAMA_CUBLAS=on"; $Env:FORCE_CMAKE=1; python3 ./setup.py install
    Requirement already satisfied: scikit-build in c:\users\pemal\appdata\local\programs\python\python311\lib\site-packages (0.17.5)
    Requirement already satisfied: distro in c:\users\pemal\appdata\local\programs\python\python311\lib\site-packages (from scikit-build) (1.8.0)
    Requirement already satisfied: packaging in c:\users\pemal\appdata\local\programs\python\python311\lib\site-packages (from scikit-build) (23.1)
    Requirement already satisfied: setuptools>=42.0.0 in c:\users\pemal\appdata\local\programs\python\python311\lib\site-packages (from scikit-build) (67.7.2)
    Requirement already satisfied: wheel>=0.32.0 in c:\users\pemal\appdata\local\programs\python\python311\lib\site-packages (from scikit-build) (0.40.0)
    Python was not found; run without arguments to install from the Microsoft Store, or disable this shortcut from Settings > Manage App Execution Aliases.

Thanks again. Pablo

pemaldonado1967 avatar May 17 '23 15:05 pemaldonado1967

Hi, I did it but it fails [same transcript as above, ending with:] "Python was not found; run without arguments to install from the Microsoft Store, or disable this shortcut from Settings > Manage App Execution Aliases."

Thanks again. Pablo

Sorry, my mistake: the python call was incorrect for a Windows environment, where `python3` can resolve to the Microsoft Store stub. This should do:

git clone https://github.com/abetlen/llama-cpp-python
cd llama-cpp-python/vendor
git clone https://github.com/ggerganov/llama.cpp
cd ..
pip3 uninstall llama-cpp-python
pip3 install scikit-build
$Env:CMAKE_ARGS="-DLLAMA_CUBLAS=on"; $Env:FORCE_CMAKE=1; py -3.11 ./setup.py install

maozdemir avatar May 17 '23 15:05 maozdemir

Hi again. I will stop working on this for now, as there seems to be no way to make it work. I have seen others on YouTube report spending over 12 hours without success. Thanks anyway. Pablo

pemaldonado1967 avatar May 17 '23 15:05 pemaldonado1967

Did you try with a virtual environment?

psegovias avatar May 17 '23 17:05 psegovias

Hi, update: I tried with Visual Studio Code instead of Visual Studio 2022 and it worked. Now testing it. Thank you! Pablo

pemaldonado1967 avatar May 17 '23 17:05 pemaldonado1967

Update during testing. How do I solve this? The output repeats the same three warnings over and over:

    gpt_tokenize: unknown token 'Γ'
    gpt_tokenize: unknown token 'Ç'
    gpt_tokenize: unknown token 'Ö'
    [the three warnings above repeat dozens of times]
    ggml_new_tensor_impl: not enough space in the context's memory pool (needed 26435840368, available 26355938400)

I ingested just one CSV file of 57KB in size.
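Editor's note on those warnings: the recurring trio Γ, Ç, Ö is classic mojibake. They are exactly what you get when the three UTF-8 bytes of a curly apostrophe (U+2019, bytes E2 80 99, common in documents exported from Office) are decoded as code page 437, the legacy Windows console code page. Whether that is what happened in this exact run is my inference, not confirmed in the thread; the effect itself is easy to demonstrate:

```python
# A curly apostrophe (U+2019) encodes to three UTF-8 bytes: E2 80 99.
raw = "don’t".encode("utf-8")
print(raw)                       # b'don\xe2\x80\x99t'

# Decoding those bytes as code page 437 yields exactly Γ, Ç, Ö.
garbled = raw.decode("cp437")
print(garbled)                   # donΓÇÖt

# Round-tripping recovers the original, which suggests the fix:
# ingest the source file with an explicit encoding="utf-8".
print(garbled.encode("cp437").decode("utf-8"))   # don’t
```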

Thanks,

Pablo

pemaldonado1967 avatar May 17 '23 18:05 pemaldonado1967

This fixed it for me.

Incorrect or incomplete installation of MSVC: Ensure you have MSVC properly installed with the correct toolset. In Visual Studio Installer, there is a section for individual components where you can make sure the necessary MSVC v142 - VS 2019 C++ x64/x86 build tools are installed.

robstoof avatar May 18 '23 20:05 robstoof

This fixed it for me.

Incorrect or incomplete installation of MSVC: Ensure you have MSVC properly installed with the correct toolset. In Visual Studio Installer, there is a section for individual components where you can make sure the necessary MSVC v142 - VS 2019 C++ x64/x86 build tools are installed.

Did you also have slow output? Did that get fixed too, or are you just not getting the unknown token warnings anymore?

NPap0 avatar May 18 '23 21:05 NPap0

Hi, update. I tried with visual studio code instead of visual studio 2022 and it worked. Now testing it. Thank you! Pablo

That is what worked for me too.

OskarKaminski avatar May 18 '23 22:05 OskarKaminski

I'm getting compilation errors (also while building the wheel). I tried installing Visual Studio and MinGW, but with no success:

  [1/4] Building C object vendor\llama.cpp\CMakeFiles\ggml.dir\ggml.c.obj
      FAILED: vendor/llama.cpp/CMakeFiles/ggml.dir/ggml.c.obj
      C:\PROGRA~2\MICROS~2\2017\BUILDT~1\VC\Tools\MSVC\1416~1.270\bin\Hostx86\x64\cl.exe  /nologo -D_CRT_SECURE_NO_WARNINGS -IC:\Users\bme\AppData\Local\Temp\pip-install-zt5anoyk\llama-cpp-python_d5edc87ec79f4c48b6d9b2076352bfb1\vendor\llama.cpp\. /DWIN32 /D_WINDOWS /O2 /Ob2 /DNDEBUG -MD /arch:AVX2 /showIncludes /Fovendor\llama.cpp\CMakeFiles\ggml.dir\ggml.c.obj /Fdvendor\llama.cpp\CMakeFiles\ggml.dir\ /FS -c C:\Users\bme\AppData\Local\Temp\pip-install-zt5anoyk\llama-cpp-python_d5edc87ec79f4c48b6d9b2076352bfb1\vendor\llama.cpp\ggml.c
      c:\users\bme\appdata\local\temp\pip-install-zt5anoyk\llama-cpp-python_d5edc87ec79f4c48b6d9b2076352bfb1\vendor\llama.cpp\ggml.h(918): error C2146: syntax error: missing ')' before identifier 'x'
      c:\users\bme\appdata\local\temp\pip-install-zt5anoyk\llama-cpp-python_d5edc87ec79f4c48b6d9b2076352bfb1\vendor\llama.cpp\ggml.h(918): error C2061: syntax error: identifier 'x'

and about a hundred more similar compilation failures

mrbrianevans avatar May 19 '23 17:05 mrbrianevans