
Cannot install llama.cpp with cuBLAS using the latest code.

Open IvoryTower800 opened this issue 1 year ago • 3 comments

I ran `CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python` in a Kaggle 2xT4 environment. It worked before, but today the same command produced the error below. Could you please tell me what I should do to install it?

Collecting llama-cpp-python==0.2.29
  Using cached llama_cpp_python-0.2.29.tar.gz (9.5 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: typing-extensions>=4.5.0 in /opt/conda/lib/python3.10/site-packages (from llama-cpp-python==0.2.29) (4.5.0)
Requirement already satisfied: numpy>=1.20.0 in /opt/conda/lib/python3.10/site-packages (from llama-cpp-python==0.2.29) (1.24.3)
Collecting diskcache>=5.6.1 (from llama-cpp-python==0.2.29)
  Obtaining dependency information for diskcache>=5.6.1 from https://files.pythonhosted.org/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl.metadata
  Using cached diskcache-5.6.3-py3-none-any.whl.metadata (20 kB)
Using cached diskcache-5.6.3-py3-none-any.whl (45 kB)
Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [86 lines of output]
  *** scikit-build-core 0.7.1 using CMake 3.22.1 (wheel)
  *** Configuring CMake...
  2024-01-15 18:32:17,544 - scikit_build_core - WARNING - libdir/ldlibrary: /opt/conda/lib/libpython3.10.a is not a real file!
  2024-01-15 18:32:17,544 - scikit_build_core - WARNING - Can't find a Python library, got libdir=/opt/conda/lib, ldlibrary=libpython3.10.a, multiarch=x86_64-linux-gnu, masd=None
  /usr/bin/cmake: /opt/conda/lib/libcurl.so.4: no version information available (required by /usr/bin/cmake)
  loading initial cache file /tmp/tmp5gda301i/build/CMakeInit.txt
  -- The C compiler identification is GNU 11.4.0
  -- The CXX compiler identification is GNU 11.4.0
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Check for working C compiler: /usr/bin/cc - skipped
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Check for working CXX compiler: /usr/bin/c++ - skipped
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Found Git: /usr/bin/git (found version "2.34.1")
  -- Looking for pthread.h
  -- Looking for pthread.h - found
  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
  -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
  -- Found Threads: TRUE
  -- Found CUDAToolkit: /usr/local/cuda/include (found version "11.8.89")
  -- cuBLAS found
  -- The CUDA compiler identification is NVIDIA 11.8.89
  -- Detecting CUDA compiler ABI info
  -- Detecting CUDA compiler ABI info - done
  -- Check for working CUDA compiler: /usr/local/cuda/bin/nvcc - skipped
  -- Detecting CUDA compile features
  -- Detecting CUDA compile features - done
  -- Using CUDA architectures: 52;61;70
  -- CUDA host compiler is GNU 11.4.0

  -- CMAKE_SYSTEM_PROCESSOR: x86_64
  -- x86 detected
  INSTALL TARGETS - target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  INSTALL TARGETS - target llama has PUBLIC_HEADER files but no PUBLIC_HEADER DESTINATION.
  -- Configuring done
  CMake Error at vendor/llama.cpp/CMakeLists.txt:782 (add_library):
    Target "ggml_shared" links to target "CUDA::cuda_driver" but the target was
    not found.  Perhaps a find_package() call is missing for an IMPORTED
    target, or an ALIAS target is missing?
  
  
  CMake Error at vendor/llama.cpp/CMakeLists.txt:789 (add_library):
    Target "llama" links to target "CUDA::cuda_driver" but the target was not
    found.  Perhaps a find_package() call is missing for an IMPORTED target, or
    an ALIAS target is missing?
  
  
  CMake Error at vendor/llama.cpp/CMakeLists.txt:789 (add_library):
    Target "llama" links to target "CUDA::cuda_driver" but the target was not
    found.  Perhaps a find_package() call is missing for an IMPORTED target, or
    an ALIAS target is missing?
  
  
  CMake Error at vendor/llama.cpp/examples/llava/CMakeLists.txt:20 (add_library):
    Target "llava_shared" links to target "CUDA::cuda_driver" but the target
    was not found.  Perhaps a find_package() call is missing for an IMPORTED
    target, or an ALIAS target is missing?
  
  
  CMake Error at vendor/llama.cpp/examples/llava/CMakeLists.txt:34 (add_executable):
    Target "llava-cli" links to target "CUDA::cuda_driver" but the target was
    not found.  Perhaps a find_package() call is missing for an IMPORTED
    target, or an ALIAS target is missing?
  
  
  CMake Error at vendor/llama.cpp/CMakeLists.txt:756 (add_library):
    Target "ggml" links to target "CUDA::cuda_driver" but the target was not
    found.  Perhaps a find_package() call is missing for an IMPORTED target, or
    an ALIAS target is missing?
  
  
  CMake Error at vendor/llama.cpp/examples/llava/CMakeLists.txt:1 (add_library):
    Target "llava" links to target "CUDA::cuda_driver" but the target was not
    found.  Perhaps a find_package() call is missing for an IMPORTED target, or
    an ALIAS target is missing?
  
  
  -- Generating done
  CMake Generate step failed.  Build files cannot be regenerated correctly.
  
  *** CMake configuration failed
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

IvoryTower800 avatar Jan 15 '24 18:01 IvoryTower800

Wrong repo. You need to raise the issue at https://github.com/abetlen/llama-cpp-python/issues

askmyteapot avatar Jan 15 '24 22:01 askmyteapot
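For readers who hit the same `CUDA::cuda_driver` error: CMake's `FindCUDAToolkit` only defines the `CUDA::cuda_driver` imported target when it can locate `libcuda.so`, which is installed by the NVIDIA driver rather than the toolkit and may be missing or off the default search path in containerized build environments. A commonly suggested workaround (a sketch, not verified on Kaggle; the stub path assumes a standard `/usr/local/cuda` toolkit layout) is to expose the toolkit's driver stub library and force a fresh CMake configure:

```shell
# The CUDA toolkit ships a linker stub for libcuda.so under lib64/stubs;
# putting it on the library search path lets FindCUDAToolkit resolve
# the CUDA::cuda_driver target at configure/link time.
export CUDA_HOME=/usr/local/cuda
export LIBRARY_PATH="$CUDA_HOME/lib64/stubs:$LIBRARY_PATH"

# FORCE_CMAKE=1 and --no-cache-dir make pip rebuild the wheel from scratch
# instead of reusing a previously failed or cached build.
CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install --no-cache-dir llama-cpp-python
```

Upgrading CMake (`pip install --upgrade cmake`) has also been reported to help in some environments, since newer `FindCUDAToolkit` versions handle the stub directory better.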

This issue is stale because it has been open for 30 days with no activity.

github-actions[bot] avatar Mar 18 '24 01:03 github-actions[bot]

Any solution yet? I have the same problem.

BahaSlama77 avatar Mar 20 '24 18:03 BahaSlama77

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions[bot] avatar May 05 '24 01:05 github-actions[bot]