Pre-built cpu wheel does not work on Ubuntu due to libc.musl dependency

Open OKUA1 opened this issue 1 year ago • 9 comments

Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • [x] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • [x] I carefully followed the README.md.
  • [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • [x] I reviewed the Discussions, and have a new bug or useful enhancement to share.

Expected Behavior

The pre-built CPU wheels should work out of the box on Ubuntu.

Current Behavior

The pre-built CPU wheels depend on libc.musl, which is not available by default on most popular Linux distributions.

The following external shared libraries are required by the wheel:
{
    "libc.musl-x86_64.so.1": null,
    "libgcc_s.so.1": null,
    "libggml.so": null,
    "libgomp.so.1": null,
    "libllama.so": null,
    "libstdc++.so.6": null
}
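
For reference, the missing library can also be confirmed by running ldd against the bundled binary (the path is taken from the traceback below; adjust it to your install location):

ldd /usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so | grep musl
# expected to print something like: libc.musl-x86_64.so.1 => not found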

Attempting to import llama_cpp results in the following error:

RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory

At the same time, the CUDA wheel does not depend on musl and works out of the box on the same system.

The following external shared libraries are required by the wheel:
{
    "libc.so.6": null,
    "libcublas.so.12": null,
    "libcuda.so.1": null,
    "libcudart.so.12": null,
    "libgcc_s.so.1": null,
    "libggml.so": null,
    "libgomp.so.1": null,
    "libllama.so": null,
    "libm.so.6": null,
    "libstdc++.so.6": null
}

Environment and Context

Ubuntu 22.04 / Ubuntu 24.04

llama-cpp-python 0.2.82 / 0.2.83

Steps to Reproduce

Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.

  1. Install a pre-built CPU wheel
  2. Try to import llama_cpp (example commands below)
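
For example (a minimal sketch; the index URL is the CPU wheel index referenced later in this thread):

pip install llama-cpp-python==0.2.83 --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
python3 -c "import llama_cpp"
# fails with the RuntimeError shown below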

Failure Logs

OSError                                   Traceback (most recent call last)

/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in _load_shared_library(lib_base_name)
     74             try:
---> 75                 return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
     76             except Exception as e:

/usr/lib/python3.10/ctypes/__init__.py in __init__(self, name, mode, handle, use_errno, use_last_error, winmode)
    373         if handle is None:
--> 374             self._handle = _dlopen(self._name, mode)
    375         else:

OSError: libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory


During handling of the above exception, another exception occurred:

RuntimeError                              Traceback (most recent call last)

<ipython-input-17-c8c7f50702fd> in <cell line: 1>()
----> 1 import llama_cpp

/usr/local/lib/python3.10/dist-packages/llama_cpp/__init__.py in <module>
----> 1 from .llama_cpp import *
      2 from .llama import *
      3 
      4 __version__ = "0.2.83"

/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in <module>
     86 
     87 # Load the library
---> 88 _lib = _load_shared_library(_lib_base_name)
     89 
     90 

/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in _load_shared_library(lib_base_name)
     75                 return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
     76             except Exception as e:
---> 77                 raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
     78 
     79     raise FileNotFoundError(

RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory

OKUA1 avatar Jul 27 '24 21:07 OKUA1

Yeah, for some reason the wheels are being built with musl instead of glibc. My fix was to do this:

apt install musl-dev
ln -s /usr/lib/x86_64-linux-musl/libc.so /lib/libc.musl-x86_64.so.1

gaby avatar Jul 28 '24 16:07 gaby

@gaby yes, I did the same and it works fine. But I would consider that more of a workaround than a long-term solution.

OKUA1 avatar Jul 28 '24 17:07 OKUA1

I would definitely consider this a workaround. I am using this lib to reduce Docker image size, so I don't want to install musl.

Probably related: https://github.com/abetlen/llama-cpp-python/issues/1507
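
In the meantime, building from source instead of pulling the pre-built wheel should sidestep the musl-linked binary, since libllama.so then gets compiled and linked against the system glibc (a sketch, assuming cmake and a C/C++ toolchain are available; if image size matters, this is better done in a separate build stage):

pip install llama-cpp-python --no-binary llama-cpp-python
# forces a source build instead of downloading the musl-linked wheel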

jcuenod avatar Aug 05 '24 13:08 jcuenod

It looks like the issue has been fixed in the latest version (0.2.88). Thanks!

arpesenti avatar Aug 14 '24 08:08 arpesenti

@arpesenti I don't think it was fixed.

The following external shared libraries are required by the wheel:
{
    "libc.musl-x86_64.so.1": null,
    "libgcc_s.so.1": null,
    "libggml.so": null,
    "libgomp.so.1": null,
    "libllama.so": null,
    "libstdc++.so.6": null
}

OKUA1 avatar Aug 15 '24 13:08 OKUA1

I was getting the same error. Downgrading to llama_cpp_python==0.3.1 seems to fix the issue.

Linguiniotta avatar Nov 21 '24 14:11 Linguiniotta

Hi! Can someone other than @Linguiniotta please verify that the issue is solved in versions >= 0.3.1? Also, I don't see any released wheel files that are >= 0.3.2 from here

gil-frenkel-HH avatar Dec 10 '24 08:12 gil-frenkel-HH

Downgrading to llama_cpp_python==0.3.1 does not solve the issue. I have just tried that.

ga92xug avatar Jan 17 '25 10:01 ga92xug

Can confirm, neither 0.3.2 nor 0.3.1 works in a UBI9 container; they still look for musl instead of glibc.

RUN python -m pip install -v "llama-cpp-python==0.3.1" --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
#15 0.529 Looking in indexes: https://pypi.org/simple, https://abetlen.github.io/llama-cpp-python/whl/cpu
#15 1.043 Collecting llama-cpp-python==0.3.1
#15 1.701   Downloading https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1/llama_cpp_python-0.3.1-cp312-cp312-linux_x86_64.whl (3.5 MB)
OSError: libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory

ligius- avatar Aug 06 '25 00:08 ligius-

@abetlen any thoughts here? Are the wheels being built on Alpine and thus getting linked against musl libc?

mhamann avatar Nov 20 '25 22:11 mhamann