llama-cpp-python
Pre-built CPU wheel does not work on Ubuntu due to libc.musl dependency
Prerequisites
Please answer the following questions for yourself before submitting an issue.
- [x] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- [x] I carefully followed the README.md.
- [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [x] I reviewed the Discussions, and have a new bug or useful enhancement to share.
Expected Behavior
The pre-built CPU wheels should work out of the box on Ubuntu.
Current Behavior
The pre-built CPU wheels depend on libc.musl (musl libc), which is not available by default on most popular Linux distributions.
The following external shared libraries are required by the wheel:
{
"libc.musl-x86_64.so.1": null,
"libgcc_s.so.1": null,
"libggml.so": null,
"libgomp.so.1": null,
"libllama.so": null,
"libstdc++.so.6": null
}
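For context, one way to cross-check the dependency list above is to inspect the bundled library directly. A minimal sketch using ldd via subprocess, assuming the dist-packages path from the traceback below:

```python
# Sketch: list the dynamic dependencies of the bundled libllama.so with ldd.
# The path is taken from the traceback below; adjust for your Python version.
import subprocess

lib = "/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so"
result = subprocess.run(["ldd", lib], capture_output=True, text=True)
print(result.stdout)
# On the broken wheel this includes a line like:
#   libc.musl-x86_64.so.1 => not found
```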
Attempting to import llama_cpp results in the following error:
RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory
At the same time, the CUDA wheel does not depend on musl and works out of the box on the same system.
The following external shared libraries are required by the wheel:
{
"libc.so.6": null,
"libcublas.so.12": null,
"libcuda.so.1": null,
"libcudart.so.12": null,
"libgcc_s.so.1": null,
"libggml.so": null,
"libgomp.so.1": null,
"libllama.so": null,
"libm.so.6": null,
"libstdc++.so.6": null
}
Environment and Context
Ubuntu 22.04 / Ubuntu 24.04
llama-cpp-python 0.2.82 / 0.2.83
Steps to Reproduce
Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.
- Install a pre-built CPU wheel
- Try to import llama_cpp (a minimal reproduction sketch follows)
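A minimal reproduction sketch, assuming Python 3.10 and the CPU wheel index used later in this thread:

```python
# Reproduction sketch. Install the pre-built CPU wheel first:
#   pip install llama-cpp-python \
#     --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
# Then importing the package fails at library load time:
import llama_cpp
# RuntimeError: Failed to load shared library
# '.../llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1:
# cannot open shared object file: No such file or directory
```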
Failure Logs
OSError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in _load_shared_library(lib_base_name)
74 try:
---> 75 return ctypes.CDLL(str(_lib_path), **cdll_args) # type: ignore
76 except Exception as e:
/usr/lib/python3.10/ctypes/__init__.py in __init__(self, name, mode, handle, use_errno, use_last_error, winmode)
373 if handle is None:
--> 374 self._handle = _dlopen(self._name, mode)
375 else:
OSError: libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory
During handling of the above exception, another exception occurred:
RuntimeError Traceback (most recent call last)
<ipython-input-17-c8c7f50702fd> in <cell line: 1>()
----> 1 import llama_cpp
/usr/local/lib/python3.10/dist-packages/llama_cpp/__init__.py in <module>
----> 1 from .llama_cpp import *
2 from .llama import *
3
4 __version__ = "0.2.83"
/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in <module>
86
87 # Load the library
---> 88 _lib = _load_shared_library(_lib_base_name)
89
90
/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in _load_shared_library(lib_base_name)
75 return ctypes.CDLL(str(_lib_path), **cdll_args) # type: ignore
76 except Exception as e:
---> 77 raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
78
79 raise FileNotFoundError(
RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory
Yeah, for some reason the wheels are being built with musl instead of glibc. My fix was to do this:
apt install musl-dev
ln -s /usr/lib/x86_64-linux-musl/libc.so /lib/libc.musl-x86_64.so.1
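For anyone applying this, a quick sanity check (just a sketch; assumes the symlink target above is on the loader's default search path):

```python
# Verify that the musl soname now resolves and that the wheel imports.
import ctypes

ctypes.CDLL("libc.musl-x86_64.so.1")  # raises OSError if the symlink is missing
import llama_cpp
print(llama_cpp.__version__)
```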
@gaby yes, I did the same and it works fine. But I would consider that more of a workaround than a long-term solution.
I would definitely consider this a workaround. I am using this lib to reduce a Docker image's size, so I don't want to install musl.
Probably related: https://github.com/abetlen/llama-cpp-python/issues/1507
It looks like the issue has been fixed in the latest version (0.2.88). Thanks!
@arpesenti I don't think it was fixed.
The following external shared libraries are required by the wheel:
{
"libc.musl-x86_64.so.1": null,
"libgcc_s.so.1": null,
"libggml.so": null,
"libgomp.so.1": null,
"libllama.so": null,
"libstdc++.so.6": null
}
I was getting the same error. Downgrading to llama_cpp_python==0.3.1 seems to fix the issue.
Hi! Can someone other than @Linguiniotta please verify that the issue is solved in versions >= 0.3.1? Also, I don't see any released wheel files >= 0.3.2 from here.
Downgrading to llama_cpp_python==0.3.1 does not solve the issue. I have just tried that.
Can confirm: neither 0.3.2 nor 0.3.1 works in a UBI9 container; they still look for musl instead of glibc.
RUN python -m pip install -v "llama-cpp-python==0.3.1" --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
#15 0.529 Looking in indexes: https://pypi.org/simple, https://abetlen.github.io/llama-cpp-python/whl/cpu
#15 1.043 Collecting llama-cpp-python==0.3.1
#15 1.701 Downloading https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1/llama_cpp_python-0.3.1-cp312-cp312-linux_x86_64.whl (3.5 MB)
OSError: libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory
@abetlen any thoughts here? Are the wheels being built on Alpine and thus getting linked against musl libc?
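For anyone who wants to test a given wheel locally, one crude but dependency-free check is to scan the installed shared object for the musl soname string (a sketch; assumes the library path from the tracebacks above):

```python
# Sketch: detect a musl-linked build by searching the shared object's bytes
# for the musl soname (a DT_NEEDED entry would appear in the .dynstr table).
from pathlib import Path

lib = Path("/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so")
data = lib.read_bytes()
print("musl-linked" if b"libc.musl-x86_64.so.1" in data else "glibc-linked")
```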