Misc. bug: llama-cli: error while loading shared libraries: libllama.so: cannot open shared object file: No such file or directory
Name and Version
version: 4493 (9c8dcefe) built with cc (Ubuntu 13.3.0-6ubuntu2~24.04) 13.3.0 for x86_64-linux-gnu
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
No response
Command line
./llama-server
Problem description & steps to reproduce
I get this error when I run llama-server:
llama-cli: error while loading shared libraries: libllama.so: cannot open shared object file: No such file or directory
llama.cpp/build/bin/llama-server: error while loading shared libraries: libggml.so: cannot open shared object file: No such file or directory
It's a minor issue; I was able to fix it with
export LD_LIBRARY_PATH=llama.cpp/build/ggml/src:llama.cpp/build/src:$LD_LIBRARY_PATH
It worked without this step on whichever branch I was using before.
I built it using
cmake -B build
cmake --build build --config Release
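For anyone hitting the same thing, here is a minimal wrapper-script sketch to avoid exporting LD_LIBRARY_PATH in every shell. It assumes the repository is checked out at ~/llama.cpp and built in-tree as above (the wrapper name and ROOT path are just placeholders, adjust to your layout):
#!/usr/bin/env bash
# llama-server-wrapped (hypothetical name): point the dynamic loader at the in-tree libs.
ROOT="$HOME/llama.cpp/build"   # adjust if the repo lives elsewhere
export LD_LIBRARY_PATH="$ROOT/ggml/src:$ROOT/src${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
exec "$ROOT/bin/llama-server" "$@"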
First Bad Commit
No response
Relevant log output
I'm also seeing this issue in Docker (sha256:ca8b4ce74dbd2a108caa80be90bd8bbdeacd8bc9f16052f1c550e8e5d1e38db3) and with the latest binary release b4529.
Also having this. Is anyone working on a fix?
Probably the same issue: https://github.com/ggerganov/llama.cpp/issues/11321#issuecomment-2610041111
#10902
Using the latest Ubuntu release candidate downloaded from GitHub (llama-b4572-bin-ubuntu-x64.zip), I get this error.
Can confirm this is also fixed in the Docker version. Thanks @ggerganov!
I am getting this in b4575.
Getting this also with the 4585 prebuilt binaries.
I'm still getting this with release b4681, downloaded this morning. llama-server will not find the libllama.so in its own directory after unpacking:
/home/clafollett/repositories/project/.bin/llama.cpp/build/bin/llama-server: error while loading shared libraries: libllama.so: cannot open shared object file: No such file or directory
Any ideas on how to solve this? I was struggling to get the CMake build to work on WSL and decided to switch to the prebuilt binaries without CUDA support for now. I have been able to get the Windows and Mac versions to work on our laptops, but the Linux binaries are not working for me.
Example:
export LLAMA_CPP_LIB="$(pwd)/.bin/llama.cpp/build/bin"
$LLAMA_CPP_LIB/llama-server \
-m "./models/DeepSeek-R1-Distill-Qwen-14B-Q8_0.gguf" \
--cache-type-k q8_0 \
--cache-type-v q8_0 \
--ctx-size 40960 \
--gpu-layers 30 \
--threads 10 \
--parallel 2
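As a stopgap, the LD_LIBRARY_PATH workaround from earlier in the thread should apply to this layout too, since the lib*.so files are unpacked into the same build/bin directory as the executable. A sketch, reusing the paths from the example above:
export LLAMA_CPP_LIB="$(pwd)/.bin/llama.cpp/build/bin"
# Prepend the unpack dir so the loader finds libllama.so etc. next to the binary.
LD_LIBRARY_PATH="$LLAMA_CPP_LIB${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}" \
  "$LLAMA_CPP_LIB/llama-server" -m "./models/DeepSeek-R1-Distill-Qwen-14B-Q8_0.gguf"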
Doing a little more digging, it looks like maybe it's the embedded library runpath value. Here is what readelf spits out:
readelf -d llama-server
Dynamic section at offset 0x21ad08 contains 36 entries:
Tag Type Name/Value
0x0000000000000001 (NEEDED) Shared library: [libcurl.so.4]
0x0000000000000001 (NEEDED) Shared library: [libllama.so]
0x0000000000000001 (NEEDED) Shared library: [libggml.so]
0x0000000000000001 (NEEDED) Shared library: [libggml-base.so]
0x0000000000000001 (NEEDED) Shared library: [libstdc++.so.6]
0x0000000000000001 (NEEDED) Shared library: [libm.so.6]
0x0000000000000001 (NEEDED) Shared library: [libgcc_s.so.1]
0x0000000000000001 (NEEDED) Shared library: [libc.so.6]
0x0000000000000001 (NEEDED) Shared library: [ld-linux-x86-64.so.2]
0x000000000000001d (RUNPATH) Library runpath: [/home/runner/work/llama.cpp/llama.cpp/build/bin:]
0x000000000000000c (INIT) 0x1a000
0x000000000000000d (FINI) 0x1b9b88
0x0000000000000019 (INIT_ARRAY) 0x218ce0
0x000000000000001b (INIT_ARRAYSZ) 56 (bytes)
0x000000000000001a (FINI_ARRAY) 0x218d18
0x000000000000001c (FINI_ARRAYSZ) 8 (bytes)
0x000000006ffffef5 (GNU_HASH) 0x3b0
0x0000000000000005 (STRTAB) 0x4de0
0x0000000000000006 (SYMTAB) 0xbe0
0x000000000000000a (STRSZ) 35929 (bytes)
0x000000000000000b (SYMENT) 24 (bytes)
0x0000000000000015 (DEBUG) 0x0
0x0000000000000003 (PLTGOT) 0x21bf88
0x0000000000000002 (PLTRELSZ) 9360 (bytes)
0x0000000000000014 (PLTREL) RELA
0x0000000000000017 (JMPREL) 0x176f0
0x0000000000000007 (RELA) 0xe270
0x0000000000000008 (RELASZ) 38016 (bytes)
0x0000000000000009 (RELAENT) 24 (bytes)
0x000000000000001e (FLAGS) BIND_NOW
0x000000006ffffffb (FLAGS_1) Flags: NOW PIE
0x000000006ffffffe (VERNEED) 0xdfc0
0x000000006fffffff (VERNEEDNUM) 6
0x000000006ffffff0 (VERSYM) 0xda3a
0x000000006ffffff9 (RELACOUNT) 1197
0x0000000000000000 (NULL) 0x0
I'm not certain whether this value has any runtime meaning, but I do not have the specified path locally. Yet running ldd shows the paths do actually resolve locally to llama-server:
ldd llama-server
linux-vdso.so.1 (0x00007ffdf6df2000)
libcurl.so.4 => /lib/x86_64-linux-gnu/libcurl.so.4 (0x00007f83933bb000)
libllama.so (0x00007f839323b000)
libggml.so (0x00007f839322c000)
libggml-base.so (0x00007f8393158000)
libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f8392eda000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f8392def000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f8392dc1000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f8392baf000)
/lib64/ld-linux-x86-64.so.2 (0x00007f83937d1000)
libnghttp2.so.14 => /lib/x86_64-linux-gnu/libnghttp2.so.14 (0x00007f8392b84000)
libidn2.so.0 => /lib/x86_64-linux-gnu/libidn2.so.0 (0x00007f8392b62000)
librtmp.so.1 => /lib/x86_64-linux-gnu/librtmp.so.1 (0x00007f8392b44000)
libssh.so.4 => /lib/x86_64-linux-gnu/libssh.so.4 (0x00007f8392ad1000)
libpsl.so.5 => /lib/x86_64-linux-gnu/libpsl.so.5 (0x00007f8392abd000)
libssl.so.3 => /lib/x86_64-linux-gnu/libssl.so.3 (0x00007f8392a13000)
libcrypto.so.3 => /lib/x86_64-linux-gnu/libcrypto.so.3 (0x00007f8392500000)
libgssapi_krb5.so.2 => /lib/x86_64-linux-gnu/libgssapi_krb5.so.2 (0x00007f83924ac000)
libldap.so.2 => /lib/x86_64-linux-gnu/libldap.so.2 (0x00007f839244f000)
liblber.so.2 => /lib/x86_64-linux-gnu/liblber.so.2 (0x00007f839243d000)
libzstd.so.1 => /lib/x86_64-linux-gnu/libzstd.so.1 (0x00007f8392383000)
libbrotlidec.so.1 => /lib/x86_64-linux-gnu/libbrotlidec.so.1 (0x00007f8392375000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f8392359000)
libggml-cpu.so (0x00007f83922a1000)
libggml-rpc.so (0x00007f839228b000)
libunistring.so.5 => /lib/x86_64-linux-gnu/libunistring.so.5 (0x00007f83920de000)
libgnutls.so.30 => /lib/x86_64-linux-gnu/libgnutls.so.30 (0x00007f8391ee4000)
libhogweed.so.6 => /lib/x86_64-linux-gnu/libhogweed.so.6 (0x00007f8391e9c000)
libnettle.so.8 => /lib/x86_64-linux-gnu/libnettle.so.8 (0x00007f8391e47000)
libgmp.so.10 => /lib/x86_64-linux-gnu/libgmp.so.10 (0x00007f8391dc1000)
libkrb5.so.3 => /lib/x86_64-linux-gnu/libkrb5.so.3 (0x00007f8391cf8000)
libk5crypto.so.3 => /lib/x86_64-linux-gnu/libk5crypto.so.3 (0x00007f8391ccc000)
libcom_err.so.2 => /lib/x86_64-linux-gnu/libcom_err.so.2 (0x00007f8391cc6000)
libkrb5support.so.0 => /lib/x86_64-linux-gnu/libkrb5support.so.0 (0x00007f8391cb9000)
libsasl2.so.2 => /lib/x86_64-linux-gnu/libsasl2.so.2 (0x00007f8391c9d000)
libbrotlicommon.so.1 => /lib/x86_64-linux-gnu/libbrotlicommon.so.1 (0x00007f8391c7a000)
libgomp.so.1 => /lib/x86_64-linux-gnu/libgomp.so.1 (0x00007f8391c24000)
libp11-kit.so.0 => /lib/x86_64-linux-gnu/libp11-kit.so.0 (0x00007f8391a80000)
libtasn1.so.6 => /lib/x86_64-linux-gnu/libtasn1.so.6 (0x00007f8391a6a000)
libkeyutils.so.1 => /lib/x86_64-linux-gnu/libkeyutils.so.1 (0x00007f8391a61000)
libresolv.so.2 => /lib/x86_64-linux-gnu/libresolv.so.2 (0x00007f8391a4e000)
libffi.so.8 => /lib/x86_64-linux-gnu/libffi.so.8 (0x00007f8391a42000)
I was able to get llama-server to run, but only after changing directory into the root llama.cpp/build/bin dir after unpacking and running chmod +x llama-server on the executable. It will not load if I run the executable from outside of that bin dir. I suspect it is the above-mentioned RUNPATH: I think it should have $ORIGIN set in the path so the loader looks in the same directory as the executable.
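If the RUNPATH is indeed the culprit, rewriting it in place should make the unpacked binaries relocatable. A sketch using patchelf (sudo apt install patchelf on Ubuntu), assuming all the lib*.so files sit next to llama-server as in the ldd output above:
cd .bin/llama.cpp/build/bin
# Single quotes keep the shell from expanding $ORIGIN; the loader expands it at run time
# to the directory containing the executable.
patchelf --set-rpath '$ORIGIN' llama-server
readelf -d llama-server | grep RUNPATH   # should now print [$ORIGIN]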
cmake -B build
cmake --build build --config Release
cd build
make install
ldconfig
Then every llama-xxx binary should work.
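Equivalently, using CMake's own install step instead of make (a sketch, assuming the default /usr/local prefix, whose lib directory ldconfig already searches on Ubuntu):
cmake -B build
cmake --build build --config Release
sudo cmake --install build   # copies binaries and shared libraries under /usr/local
sudo ldconfig                # refresh the loader cache so libllama.so is found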
This issue was closed because it has been inactive for 14 days since being marked as stale.