
Attribute Error: /llama.cpp/libllama.so: undefined symbol: llama_numa_init

Open VKonanur opened this issue 1 year ago • 3 comments

Expected Behavior

Once I set the necessary environment variable (`export LLAMA_CPP_LIB=/most recent build/libllama.so`), the code should execute without any error.
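For reference, a minimal sketch of that setup. The path below is a placeholder, not the reporter's actual build path; the variable must be set before the Python process imports `llama_cpp`:

```shell
# Point llama-cpp-python at a locally built libllama.so.
# The path here is illustrative; substitute your own build output.
export LLAMA_CPP_LIB=/path/to/llama.cpp/libllama.so
echo "$LLAMA_CPP_LIB"
```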

Current Behavior

After creating a new virtual environment with the same dependencies installed (i.e. the same Python version and llama-cpp-python version), I'm now getting the AttributeError above.

Environment and Context

Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.

RHEL x86_64 GNU/Linux

$ python3 --version
Python 3.9.18

$ make --version
GNU Make 4.3

$ g++ --version
g++ (GCC) 11.4.1 20230605 (Red Hat 11.4.1-2)


# Steps to Reproduce

I just followed the necessary steps: exporting the environment variable and running my Streamlit app.





# Failure Logs

AttributeError: /home/llama.cpp/libllama.so: undefined symbol: llama_numa_init
Traceback:
File "/home/lib64/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 584, in _run_script
    exec(code, module.__dict__)
File "/home/new_app.py", line 109, in <module>
    main()
File "/home/new_app.py", line 103, in main
    st.session_state.conversation = get_conversation_chain(
File "/home/new_app.py", line 32, in get_conversation_chain
    llm = llamacpp.LlamaCpp(
File "/home/lib64/python3.9/site-packages/langchain_core/load/serializable.py", line 120, in __init__
    super().__init__(**kwargs)
File "/home/lib64/python3.9/site-packages/pydantic/v1/main.py", line 339, in __init__
    values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
File "/home/lib64/python3.9/site-packages/pydantic/v1/main.py", line 1100, in validate_model
    values = validator(cls_, values)
File "/home/lib64/python3.9/site-packages/langchain_community/llms/llamacpp.py", line 140, in validate_environment
    from llama_cpp import Llama, LlamaGrammar
File "/home/lib64/python3.9/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
File "/home/lib64/python3.9/site-packages/llama_cpp/llama_cpp.py", line 1040, in <module>
    def llama_numa_init(numa: int, /):
File "/home/lib64/python3.9/site-packages/llama_cpp/llama_cpp.py", line 121, in decorator
    func = getattr(lib, name)
File "/usr/lib64/python3.9/ctypes/__init__.py", line 387, in __getattr__
    func = self.__getitem__(name)
File "/usr/lib64/python3.9/ctypes/__init__.py", line 392, in __getitem__
    func = self._FuncPtr((name_or_ordinal, self))
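For context on the last frames above: `llama_cpp` resolves each C function with `getattr(lib, name)`, and ctypes raises exactly this AttributeError when the loaded shared library does not export the requested symbol (e.g. a stale `libllama.so` built before `llama_numa_init` existed). A minimal sketch of the mechanism, using the C math library as a stand-in for `libllama.so`:

```python
import ctypes
import ctypes.util

# Load a shared library the same way llama_cpp loads libllama.so.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# A symbol the library exports resolves fine.
cos = getattr(libm, "cos")

# A symbol the library does NOT export triggers the same
# "undefined symbol" AttributeError seen in the failure log.
try:
    getattr(libm, "llama_numa_init")
except AttributeError as exc:
    print("lookup failed:", exc)
```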

Environment info:

llama-cpp-python$ git log | head -1
commit a420f9608bbd3b76e8bfbb6cdcf4d3fa69efe5c0

llama-cpp-python$ python3 --version
Python 3.9.18

VKonanur avatar Apr 16 '24 23:04 VKonanur

Getting this error on aarch64 as well

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/usr/local/lib/python3.11/site-packages/uvicorn/__main__.py", line 4, in <module>
    uvicorn.main()
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 409, in main
    run(
  File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 575, in run
    server.run()
  File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 65, in run
    return asyncio.run(self.serve(sockets=sockets))
  File "/usr/lib64/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
  File "/usr/lib64/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "/usr/lib64/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 69, in serve
    await self._serve(sockets)
  File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 76, in _serve
    config.load()
  File "/usr/local/lib/python3.11/site-packages/uvicorn/config.py", line 433, in load
    self.loaded_app = import_from_string(self.app)
  File "/usr/local/lib/python3.11/site-packages/uvicorn/importer.py", line 19, in import_from_string
    module = importlib.import_module(module_str)
  File "/usr/lib64/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1126, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1126, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/src/odsc-llama-cpp-python/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/usr/src/odsc-llama-cpp-python/llama_cpp/llama_cpp.py", line 1035, in <module>
    @ctypes_function(
  File "/usr/src/odsc-llama-cpp-python/llama_cpp/llama_cpp.py", line 121, in decorator
    func = getattr(lib, name)
  File "/usr/lib64/python3.11/ctypes/__init__.py", line 389, in __getattr__
    func = self.__getitem__(name)
  File "/usr/lib64/python3.11/ctypes/__init__.py", line 394, in __getitem__
    func = self._FuncPtr((name_or_ordinal, self))
AttributeError: /usr/src/odsc-llama-cpp-python/llama_cpp/libllama.so: undefined symbol: llama_numa_init

gargnipungarg avatar Apr 24 '24 15:04 gargnipungarg

I'd recommend getting a fresh clone of llama.cpp and rebuilding it.

VKonanur avatar Apr 24 '24 19:04 VKonanur

On Linux you can run `readelf -Ws --dyn-syms libllama.so` to check the exported symbols, but yes, I believe the issue is in the build step for llama.cpp as a shared library.

abetlen avatar Apr 25 '24 06:04 abetlen