
libllama.dylib and macOS building problem

Open knowrohit opened this issue 1 year ago • 7 comments

I got this when I ran privateGPT.py:

Describe the bug and how to reproduce it

Loaded 1 new documents from source_documents
Split into 146 chunks of text (max. 500 tokens each)
Creating embeddings. May take some minutes...
Using embedded DuckDB with persistence: data will be stored in: db_vector
Ingestion complete! You can now run privateGPT.py to query your documents
(alfred) (base) rohittiwari@Rohits-MacBook-Pro alfredGPT % python privateGPT.py

File "privateGPT.py", line 31 match model_type: ^ SyntaxError: invalid syntax (alfred) (base) rohittiwari@Rohits-MacBook-Pro alfredGPT % python privateGPT.py

Using embedded DuckDB with persistence: data will be stored in: db_vector
Traceback (most recent call last):
  File "privateGPT.py", line 74, in <module>
    main()
  File "privateGPT.py", line 34, in main
    llm = GPT4All(model=model_path, n_ctx=model_n_ctx, backend='gptj', callbacks=callbacks, verbose=False)
  File "pydantic/main.py", line 339, in pydantic.main.BaseModel.__init__
  File "pydantic/main.py", line 1102, in pydantic.main.validate_model
  File "/Users/rohittiwari/Downloads/alfredGPT/alfred/lib/python3.8/site-packages/langchain/llms/gpt4all.py", line 133, in validate_environment
    from gpt4all import GPT4All as GPT4AllModel
  File "/Users/rohittiwari/Downloads/alfredGPT/alfred/lib/python3.8/site-packages/gpt4all/__init__.py", line 1, in <module>
    from .pyllmodel import LLModel # noqa
  File "/Users/rohittiwari/Downloads/alfredGPT/alfred/lib/python3.8/site-packages/gpt4all/pyllmodel.py", line 55, in <module>
    llmodel, llama = load_llmodel_library()
  File "/Users/rohittiwari/Downloads/alfredGPT/alfred/lib/python3.8/site-packages/gpt4all/pyllmodel.py", line 49, in load_llmodel_library
    llama_lib = ctypes.CDLL(llama_dir, mode=ctypes.RTLD_GLOBAL)
  File "/Users/rohittiwari/opt/anaconda3/lib/python3.8/ctypes/__init__.py", line 381, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: dlopen(/Users/rohittiwari/Downloads/alfredGPT/alfred/lib/python3.8/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib, 10): Symbol not found: __ZNKSt3__115basic_stringbufIcNS_11char_traitsIcEENS_9allocatorIcEEE3strEv
  Referenced from: /Users/rohittiwari/Downloads/alfredGPT/alfred/lib/python3.8/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib (which was built for Mac OS X 12.6)
  Expected in: /usr/lib/libc++.1.dylib

How do I solve it?

knowrohit avatar May 24 '23 11:05 knowrohit

Got the same issue.

giacomo-domeniconi avatar May 25 '23 14:05 giacomo-domeniconi

I'm trying to follow the example in the README and got a similar error:

(privategpt) sbslee@Seung-beens-Mac-mini privateGPT % python privateGPT.py
Using embedded DuckDB with persistence: data will be stored in: db
Traceback (most recent call last):
  File "/Users/sbslee/Desktop/privateGPT/privateGPT.py", line 76, in <module>
    main()
  File "/Users/sbslee/Desktop/privateGPT/privateGPT.py", line 36, in main
    llm = GPT4All(model=model_path, n_ctx=model_n_ctx, backend='gptj', callbacks=callbacks, verbose=False)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "pydantic/main.py", line 339, in pydantic.main.BaseModel.__init__
  File "pydantic/main.py", line 1102, in pydantic.main.validate_model
  File "/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/langchain/llms/gpt4all.py", line 133, in validate_environment
    from gpt4all import GPT4All as GPT4AllModel
  File "/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/gpt4all/__init__.py", line 1, in <module>
    from .pyllmodel import LLModel # noqa
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/gpt4all/pyllmodel.py", line 55, in <module>
    llmodel, llama = load_llmodel_library()
                     ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/gpt4all/pyllmodel.py", line 49, in load_llmodel_library
    llama_lib = ctypes.CDLL(llama_dir, mode=ctypes.RTLD_GLOBAL)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/ctypes/__init__.py", line 376, in __init__
    self._handle = _dlopen(self._name, mode)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: dlopen(/Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib, 10): Symbol not found: __ZNKSt3__115basic_stringbufIcNS_11char_traitsIcEENS_9allocatorIcEEE3strEv
  Referenced from: /Users/sbslee/opt/anaconda3/envs/privategpt/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib (which was built for Mac OS X 12.6)
  Expected in: /usr/lib/libc++.1.dylib

(privategpt) sbslee@Seung-beens-Mac-mini privateGPT % 

My computer's specs:

  • macOS Big Sur
  • Version 11.2.1
  • Mac mini (M1, 2020)
  • Chip Apple M1
  • Memory 8 GB

sbslee avatar May 26 '23 01:05 sbslee

Got the same issue on macOS Big Sur v11.7.5.

richardchen85 avatar May 26 '23 03:05 richardchen85

Got the same issue on macOS Big Sur 11.3.1.

wenziheng777 avatar May 26 '23 13:05 wenziheng777

The match language feature requires Python >= 3.10. Check python3 --version; if it's < 3.10.0, then that is the problem.
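
If it helps, here is a minimal sketch (my own, not code from privateGPT) that makes the version requirement explicit; the model-type strings in the match block are illustrative placeholders, not necessarily the exact values privateGPT uses:

import sys

# The match statement is only valid syntax on Python 3.10+, so older interpreters
# fail at compile time with "SyntaxError: invalid syntax" on the match line.
if sys.version_info < (3, 10):
    raise SystemExit(
        f"Python {sys.version.split()[0]} is too old for privateGPT.py's match statement; use 3.10+"
    )

# Illustrative dispatch, similar in shape to the block the traceback points at.
model_type = "GPT4All"
match model_type:
    case "GPT4All":
        print("would construct a GPT4All LLM here")
    case "LlamaCpp":
        print("would construct a LlamaCpp LLM here")
    case _:
        raise ValueError(f"Unsupported model type: {model_type}")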

kris-watts-gravwell avatar May 26 '23 16:05 kris-watts-gravwell

When I ran it with Python < 3.10, I got the following exception:

File "privateGPT.py", line 31 match model_type: ^ SyntaxError: invalid syntax

Later, after upgrading to 3.10, I see the following:

OSError: dlopen(/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib, 10): Symbol not found: __ZNKSt3__115basic_stringbufIcNS_11char_traitsIcEENS_9allocatorIcEEE3strEv
  Referenced from: /Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib (which was built for Mac OS X 12.6)
  Expected in: /usr/lib/libc++.1.dylib

Using macOS Big Sur 11.7.6 (20G1231).

reddy279 avatar May 27 '23 13:05 reddy279

My Python version is 3.11.2. I got this exception:

OSError: dlopen(opt/anaconda3/envs/privateGPT/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib, 10): Symbol not found: __ZNKSt3__115basic_stringbufIcNS_11char_traitsIcEENS_9allocatorIcEEE3strEv
  Referenced from: opt/anaconda3/envs/privateGPT/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib (which was built for Mac OS X 12.6)
  Expected in: /usr/lib/libc++.1.dylib

wenziheng777 avatar May 29 '23 06:05 wenziheng777

I found the problem. If you look at my error message above, it includes "which was built for Mac OS X 12.6". When I updated my macOS, the problem was resolved. Hope this helps.
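
For anyone unsure whether this applies to them, a rough check (my own sketch, nothing official) that compares the running macOS version against the 12.6 target named in the dlopen error; the 12.6 constant is simply copied from that message:

import platform

# The error says libllama.dylib "was built for Mac OS X 12.6", so hosts older
# than that are missing the libc++ symbol it expects.
REQUIRED = (12, 6)  # copied from the dlopen message; adjust if your wheel reports a different target

release = platform.mac_ver()[0]  # e.g. "11.2.1" or "13.4"; empty string off macOS
if not release:
    raise SystemExit("Not running on macOS")

current = tuple(int(part) for part in release.split(".")[:2])
if current < REQUIRED:
    print(f"macOS {release} is older than the dylib's 12.6 target; "
          "updating macOS (or rebuilding gpt4all) should fix the dlopen error")
else:
    print(f"macOS {release} meets the dylib's 12.6 target; the dlopen error likely has another cause")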

sbslee avatar May 30 '23 06:05 sbslee

My Python version is 3.11.2. I got this exception:

OSError: dlopen(opt/anaconda3/envs/privateGPT/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib, 10): Symbol not found: __ZNKSt3__115basic_stringbufIcNS_11char_traitsIcEENS_9allocatorIcEEE3strEv
  Referenced from: opt/anaconda3/envs/privateGPT/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.dylib (which was built for Mac OS X 12.6)
  Expected in: /usr/lib/libc++.1.dylib

I fixed this by updating my macOS to version 13.4.

wenziheng777 avatar May 30 '23 10:05 wenziheng777

A few steps to remember (see the pre-flight check sketch after this list):

  1. brew install gcc
  2. xcode-select --install
  3. Xcode installed as well
  4. Python 3.9+
  5. pip install wheel (optional)
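
As promised above, a small pre-flight sketch (my own, not part of the repo) that checks these prerequisites from Python; it checks for Python 3.10+ per the match-statement comment earlier in the thread, even though step 4 says 3.9+:

import shutil
import subprocess
import sys

# Rough prerequisite check for the steps listed above.
checks = {
    "gcc (brew install gcc)": shutil.which("gcc") is not None,
    "Xcode command line tools (xcode-select --install)":
        shutil.which("xcode-select") is not None
        and subprocess.run(["xcode-select", "-p"], capture_output=True).returncode == 0,
    "Python 3.10+ (needed for the match statement)": sys.version_info >= (3, 10),
}

try:
    import wheel  # step 5, optional
    checks["wheel (pip install wheel)"] = True
except ImportError:
    checks["wheel (pip install wheel)"] = False

for name, ok in checks.items():
    print(("OK      " if ok else "MISSING ") + name)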

knowrohit avatar May 30 '23 11:05 knowrohit

@knowrohit I'm getting the same error, using macOS Big Sur 11.7.6. How do I resolve this?

Rehman-Akram avatar Jan 28 '24 17:01 Rehman-Akram