private-gpt
libllama.dylib and macOS build problem
I got this when I ran privateGPT.py:
Describe the bug and how to reproduce it:

Loaded 1 new documents from source_documents
Split into 146 chunks of text (max. 500 tokens each)
Creating embeddings. May take some minutes...
Using embedded DuckDB with persistence: data will be stored in: db_vector
Ingestion complete! You can now run privateGPT.py to query your documents
(alfred) (base) rohittiwari@Rohits-MacBook-Pro alfredGPT % python privateGPT.py
  File "privateGPT.py", line 31
    match model_type:
    ^
SyntaxError: invalid syntax
(alfred) (base) rohittiwari@Rohits-MacBook-Pro alfredGPT % python privateGPT.py
Using embedded DuckDB with persistence: data will be stored in: db_vector
Traceback (most recent call last):
  File "privateGPT.py", line 74, in
How do I solve it?
Got the same issue.
I'm trying to follow the example in the README and got a similar error:
(privategpt) sbslee@Seung-beens-Mac-mini privateGPT % python privateGPT.py
Using embedded DuckDB with persistence: data will be stored in: db
Traceback (most recent call last):
File "/Users/sbslee/Desktop/privateGPT/privateGPT.py", line 76, in <module>
main()
File "/Users/sbslee/Desktop/privateGPT/privateGPT.py", line 36, in main
llm = GPT4All(model=model_path, n_ctx=model_n_ctx, backend='gptj', callbacks=callbacks, verbose=False)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "pydantic/main.py", line 339, in pydantic.main.BaseModel.__init__
File "pydantic/main.py", line 1102, in pydantic.main.validate_model
File "/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/langchain/llms/gpt4all.py", line 133, in validate_environment
from gpt4all import GPT4All as GPT4AllModel
File "/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/gpt4all/__init__.py", line 1, in <module>
from .pyllmodel import LLModel # noqa
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/gpt4all/pyllmodel.py", line 55, in <module>
llmodel, llama = load_llmodel_library()
^^^^^^^^^^^^^^^^^^^^^^
File "/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/gpt4all/pyllmodel.py", line 49, in load_llmodel_library
llama_lib = ctypes.CDLL(llama_dir, mode=ctypes.RTLD_GLOBAL)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/ctypes/__init__.py", line 376, in __init__
self._handle = _dlopen(self._name, mode)
^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: dlopen(/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib, 10): Symbol not found: __ZNKSt3__115basic_stringbufIcNS_11char_traitsIcEENS_9allocatorIcEEE3strEv
Referenced from: /Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib (which was built for Mac OS X 12.6)
Expected in: /usr/lib/libc++.1.dylib
(privategpt) sbslee@Seung-beens-Mac-mini privateGPT %
My computer's specs:
- macOS Big Sur
- Version 11.2.1
- Mac mini (M1, 2020)
- Chip: Apple M1
- Memory: 8 GB
Got the same issue on macOS Big Sur v11.7.5.
Got the same issue on macOS Big Sur 11.3.1.
The match language feature requires Python >= 3.10. Check python3 --version; if it's < 3.10.0, then that is the problem.
When I ran it with Python < 3.10, I got the following exception:

  File "privateGPT.py", line 31
    match model_type:
    ^
SyntaxError: invalid syntax

Later, after upgrading to 3.10, I see the following:

OSError: dlopen(/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib, 10): Symbol not found: __ZNKSt3__115basic_stringbufIcNS_11char_traitsIcEENS_9allocatorIcEEE3strEv
Referenced from: /Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib (which was built for Mac OS X 12.6)
Expected in: /usr/lib/libc++.1.dylib
Using macOS Big Sur 11.7.6 (20G1231).
My Python version is 3.11.2. I got this exception:

OSError: dlopen(opt/anaconda3/envs/privateGPT/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib, 10): Symbol not found: __ZNKSt3__115basic_stringbufIcNS_11char_traitsIcEENS_9allocatorIcEEE3strEv
Referenced from: opt/anaconda3/envs/privateGPT/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib (which was built for Mac OS X 12.6)
Expected in: /usr/lib/libc++.1.dylib
I found the problem. If you look at my error message above, it includes "which was built for Mac OS X 12.6". When I updated my macOS, the problem was resolved. Hope this helps.
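For anyone debugging this, the failing call can be reproduced in isolation with ctypes, since that is where gpt4all's pyllmodel.py blows up. A minimal sketch; the path below is a placeholder, so substitute the libllama.dylib path from your own traceback:

```python
import ctypes

# Placeholder path; use the full dylib path printed in your own OSError.
lib_path = "./libllama.dylib"

try:
    # Same call made in gpt4all's pyllmodel.py. If the dylib was built
    # against a newer macOS than the one running, dlopen cannot resolve the
    # referenced libc++ symbol and ctypes surfaces that as an OSError.
    ctypes.CDLL(lib_path, mode=ctypes.RTLD_GLOBAL)
    result = "loaded"
except OSError as exc:
    result = f"dlopen failed: {exc}"

print(result)
```

If this prints the same "Symbol not found" message, the problem is purely in the OS-vs-dylib mismatch, not in privateGPT itself.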
I fixed this when I updated my macOS to version 13.4.
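A quick way to check whether a machine is affected, given that the bundled dylib in these tracebacks reports being built for Mac OS X 12.6. The 12.6 threshold below is taken from the error text in this thread, not from any gpt4all documentation:

```python
import platform

# Version the bundled libllama.dylib reports being built for (per the
# "which was built for Mac OS X 12.6" note in the dlopen error above).
REQUIRED = (12, 6)

ver_str = platform.mac_ver()[0]  # empty string on non-macOS systems
if not ver_str:
    status = "not macOS"
else:
    major_minor = tuple(int(p) for p in ver_str.split(".")[:2])
    if major_minor < REQUIRED:
        status = (f"macOS {ver_str} predates 12.6: update macOS "
                  "(or build libllama.dylib locally)")
    else:
        status = f"macOS {ver_str} OK"

print(status)
```

Every "got same issue" report in this thread is on Big Sur (11.x), which is consistent with this check: anything below 12.6 hits the missing-symbol error.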
A few steps to remember:
- brew install gcc
- xcode-select --install
- Xcode installed as well, lmao
- Python 3.10+ (the match statement needs it)
- pip install wheel (optional)
@knowrohit Getting the same error using macOS Big Sur 11.7.6. How do I resolve this?