
macOS 14 Sonoma Incompatibility Issue with libllama.dylib on Apple Silicon

Open babyzmz opened this issue 2 years ago • 5 comments

Describe the bug

Reviewing the error logs makes it evident that there is a compatibility problem when loading the shared library libllama.dylib. The core of the problem is an architecture mismatch:

The machine runs on arm64, the architecture of Apple Silicon Macs such as those with the M1 chip. The installed libllama.dylib, however, is compiled for x86_64. This mismatch produces an error when the system tries to load the library, as shown in the following segment of the log:

OSError: dlopen(/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib, 0x0006): tried: '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))
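The mismatch described above can be checked locally. A minimal diagnostic sketch (not from the logs, stdlib only): `platform.machine()` reports the architecture the running Python was built for; if it prints `x86_64` on an Apple Silicon Mac, the Python itself is an Intel build (for example an x86_64 Anaconda running under Rosetta), and pip will install x86_64 binaries such as the libllama.dylib in the traceback.

```python
# Report the architecture of the currently running Python interpreter.
import platform

arch = platform.machine()
print(arch)  # 'arm64' on a native Apple Silicon Python, 'x86_64' under Rosetta/Intel
```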

Reproduce

Run interpreter with a local model, then answer y at the prompt: Local LLM interface package not found. Install llama-cpp-python? (Y/n): y

Expected behavior

Model found at /Users/richiezhang/Library/Application Support/Open
Interpreter/models/codellama-7b-instruct.Q4_K_M.gguf
[?] Local LLM interface package not found. Install llama-cpp-python? (Y/n): y

Requirement already satisfied: llama-cpp-python in ./anaconda3/lib/python3.11/site-packages (0.2.6)
Requirement already satisfied: typing-extensions>=4.5.0 in ./anaconda3/lib/python3.11/site-packages (from llama-cpp-python) (4.7.1)
Requirement already satisfied: numpy>=1.20.0 in ./anaconda3/lib/python3.11/site-packages (from llama-cpp-python) (1.25.2)
Requirement already satisfied: diskcache>=5.6.1 in ./anaconda3/lib/python3.11/site-packages (from llama-cpp-python) (5.6.3)

Traceback (most recent call last):
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/llama_cpp.py", line 67, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)
  File "/Users/richiezhang/anaconda3/lib/python3.11/ctypes/__init__.py", line 376, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: dlopen(/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib, 0x0006): tried: '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (no such file), '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/interpreter/get_hf_llm.py", line 164, in get_hf_llm
    from llama_cpp import Llama
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/llama_cpp.py", line 80, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/llama_cpp.py", line 69, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib': dlopen(/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib, 0x0006): tried: '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (no such file), '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/llama_cpp.py", line 67, in _load_shared_library
    return ctypes.CDLL(str(_lib_path), **cdll_args)
  File "/Users/richiezhang/anaconda3/lib/python3.11/ctypes/__init__.py", line 376, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: dlopen(/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib, 0x0006): tried: '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (no such file), '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/interpreter/interpreter.py", line 323, in chat
    self.llama_instance = get_hf_llm(self.model, self.debug_mode, self.context_window)
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/interpreter/get_hf_llm.py", line 223, in get_hf_llm
    from llama_cpp import Llama
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/llama_cpp.py", line 80, in <module>
    _lib = _load_shared_library(_lib_base_name)
  File "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/llama_cpp.py", line 69, in _load_shared_library
    raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
RuntimeError: Failed to load shared library '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib': dlopen(/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib, 0x0006): tried: '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (no such file), '/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))

▌ Failed to install TheBloke/CodeLlama-7B-Instruct-GGUF.

Common Fixes: You can follow our simple setup docs at the link below to resolve common errors.

Screenshots

No response

Open Interpreter version

v0.1.4

Python version

3.11.5

Operating System name and version

macOS 14 Sonoma

Additional context

No response

babyzmz avatar Sep 16 '23 09:09 babyzmz

I believe reinstalling the llama-cpp-python package before running interpreter will help, once llama-cpp-python has been updated to support macOS 14 Sonoma, which as of today is still in preview/early access:

python -m pip install --upgrade --force-reinstall llama-cpp-python

Running the above before executing interpreter on the latest patch level of macOS 13 works on an M1; I tested it just now.
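Before reinstalling, it is worth confirming which interpreter pip installs into. A Rosetta (x86_64) Anaconda Python will keep producing x86_64 builds of llama-cpp-python no matter how many times the package is reinstalled. A small sketch using only the standard library:

```python
# Show which Python 'python -m pip' installs into, and its architecture.
import platform
import sys

print(sys.executable)      # the interpreter that pip installs packages for
print(platform.machine())  # must be 'arm64' to get an arm64 libllama.dylib
```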

Not an open interpreter bug, I reason?

fredrik-hansen avatar Sep 18 '23 06:09 fredrik-hansen

Though I did exactly what "https://github.com/KillianLucas/open-interpreter/blob/main/docs/MACOS.md" told me, it still raised the same error. I believe it's a problem with the llama-cpp-python package, since there are similar issues in their repo: "https://github.com/abetlen/llama-cpp-python/issues?q=+but+is+an+incompatible+architecture+%28have+%27x86_64%27%2C+need+%27arm64%27%29%29"

deerestFarther avatar Sep 19 '23 14:09 deerestFarther

Try this command instead to reinstall and rebuild the llama-cpp-python package:

CMAKE_ARGS="-DCMAKE_OSX_ARCHITECTURES=arm64 -DLLAMA_METAL=on" FORCE_CMAKE=1 pip install --upgrade --verbose --force-reinstall --no-cache-dir llama-cpp-python

It works on my M1 Max Mac Studio. Hope it helps.
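After the rebuild, the dylib's architecture can be verified without importing it (importing would raise the dlopen error again if it is still x86_64). A sketch assuming a thin (non-fat) little-endian Mach-O file; the cputype constants come from Apple's <mach/machine.h>: 0x0100000C is arm64 and 0x01000007 is x86_64.

```python
# Read the magic and cputype fields from a Mach-O file header.
import struct
from pathlib import Path

def macho_cputype(path):
    """Return the cputype field (second 32-bit word) of a Mach-O header."""
    magic, cputype = struct.unpack("<II", Path(path).read_bytes()[:8])
    return cputype
```

Usage, with the path from the logs above (adjust for your environment):

```python
# cputype = macho_cputype(
#     "/Users/richiezhang/anaconda3/lib/python3.11/site-packages/llama_cpp/libllama.dylib")
# print("arm64" if cputype == 0x0100000C else hex(cputype))
```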

deerestFarther avatar Sep 19 '23 14:09 deerestFarther

Is there a way to use pip3/python3?

PMLS3 avatar Sep 20 '23 07:09 PMLS3

@babyzmz Did you ever find a solution to this?

alexandriabindas avatar Dec 01 '23 01:12 alexandriabindas

External, deprecated ooba, closing

Notnaton avatar Feb 12 '24 17:02 Notnaton