AttributeError: /llama.cpp/libllama.so: undefined symbol: llama_numa_init
# Expected Behavior
Once I set the necessary environment variable (`export LLAMA_CPP_LIB=/most recent build/libllama.so`), the code executes without any error.
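For reference, the variable has to be set before `llama_cpp` is imported, since the module resolves the library path at import time. A minimal sketch (the path below is a placeholder, not my actual build location):

```python
import os

# LLAMA_CPP_LIB must point at the shared library *before* llama_cpp is
# imported; llama_cpp reads this variable when the module is loaded.
os.environ["LLAMA_CPP_LIB"] = "/path/to/llama.cpp/libllama.so"

# import llama_cpp  # would now load the library named above
```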
# Current Behavior
I created a new virtual environment with the same dependencies installed (i.e., the same Python version and the same llama-cpp-python version), and this time I'm getting the attribute error above.
# Environment and Context
Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.
RHEL x86_64 GNU/Linux
$ python3 --version
3.9.18
$ make --version
GNU Make 4.3
$ g++ --version
g++ (GCC) 11.4.1 20230605 (Red Hat 11.4.1-2)
# Steps to Reproduce
I just followed the necessary steps: exporting the environment variable and running my Streamlit app.
**Note: Many issues seem to be regarding functional or performance issues / differences with `llama.cpp`. In these cases we need to confirm that you're comparing against the version of `llama.cpp` that was built with your python package, and which parameters you're passing to the context.**
# Failure Logs
AttributeError: /home/llama.cpp/libllama.so: undefined symbol: llama_numa_init
Traceback:
File "/home/lib64/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 584, in _run_script
exec(code, module.__dict__)
File "/home/new_app.py", line 109, in <module>
main()
File "/home/new_app.py", line 103, in main
st.session_state.conversation = get_conversation_chain(
File "/home/new_app.py", line 32, in get_conversation_chain
llm = llamacpp.LlamaCpp(
File "/home/lib64/python3.9/site-packages/langchain_core/load/serializable.py", line 120, in __init__
super().__init__(**kwargs)
File "/home/lib64/python3.9/site-packages/pydantic/v1/main.py", line 339, in __init__
values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
File "/home/lib64/python3.9/site-packages/pydantic/v1/main.py", line 1100, in validate_model
values = validator(cls_, values)
File "/home/lib64/python3.9/site-packages/langchain_community/llms/llamacpp.py", line 140, in validate_environment
from llama_cpp import Llama, LlamaGrammar
File "/home/lib64/python3.9/site-packages/llama_cpp/__init__.py", line 1, in <module>
from .llama_cpp import *
File "/home/lib64/python3.9/site-packages/llama_cpp/llama_cpp.py", line 1040, in <module>
def llama_numa_init(numa: int, /):
File "/home/lib64/python3.9/site-packages/llama_cpp/llama_cpp.py", line 121, in decorator
func = getattr(lib, name)
File "/usr/lib64/python3.9/ctypes/__init__.py", line 387, in __getattr__
func = self.__getitem__(name)
File "/usr/lib64/python3.9/ctypes/__init__.py", line 392, in __getitem__
func = self._FuncPtr((name_or_ordinal, self))
Environment info:
llama-cpp-python$ git log | head -1
commit a420f9608bbd3b76e8bfbb6cdcf4d3fa69efe5c0
llama-cpp-python$ python3 --version
Python 3.9.18
Getting this error on aarch64 as well
Traceback (most recent call last):
File "
I'd recommend getting a fresh clone of llama.cpp and rebuilding it.
On Linux you can run `readelf -Ws --dyn-syms libllama.so` to check the exported symbols, but yes, I believe the issue is in the build step for llama.cpp as a shared library.
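You can also probe for the symbol from Python itself with `ctypes`. A small sketch (demonstrated against libc, since its exports are stable; point it at the `libllama.so` path from your error message instead):

```python
import ctypes
import ctypes.util

def has_symbol(lib_path: str, name: str) -> bool:
    """Return True if the shared library at lib_path exports `name`."""
    # hasattr triggers the same getattr-based lookup that llama_cpp uses,
    # so False here means llama_cpp would raise AttributeError.
    return hasattr(ctypes.CDLL(lib_path), name)

libc = ctypes.util.find_library("c")
print(has_symbol(libc, "printf"))           # True: exported by libc
print(has_symbol(libc, "llama_numa_init"))  # False: missing, as in the report
```

If this returns `False` for `llama_numa_init` on your `libllama.so`, the library predates that API and needs to be rebuilt from a current llama.cpp checkout.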