Version mismatch with gpt4all model?
When following the README, including downloading the model from the URL provided, I run into the following error on ingest:
llama.cpp: loading model from models/ggml-model-q4_0.bin
error loading model: unknown (magic, version) combination: 6e756f46, 52202e64; is this really a GGML file?
Links used from the README:
https://gpt4all.io/models/ggml-gpt4all-j-v1.3-groovy.bin
https://huggingface.co/Pi3141/alpaca-native-7B-ggml/resolve/397e872bf4c83f4c642317a5bf65ce84a105786e/ggml-model-q4_0.bin
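For what it's worth, the two values in the error read as ASCII when interpreted as little-endian bytes ("Foun" and "d. R"), so the downloaded file may start with text such as "Found. Redirecting..." rather than a binary GGML header. Here is a minimal sketch to check the first bytes yourself (Python; the path is the one from the error above, and the list of known magics is my assumption and may be incomplete):

```python
# Minimal check: read the first bytes of the model file and see whether
# they match a known llama.cpp GGML magic or look like plain text.
import struct

MODEL_PATH = "models/ggml-model-q4_0.bin"  # path from the error above

# Historical llama.cpp magics (assumption: this list may be incomplete).
KNOWN_MAGICS = {
    0x67676D6C: "ggml (unversioned)",
    0x67676D66: "ggmf (versioned)",
    0x67676A74: "ggjt (versioned, mmap-able)",
}

with open(MODEL_PATH, "rb") as f:
    head = f.read(16)

magic, version = struct.unpack("<II", head[:8])
print(f"magic=0x{magic:08x} version=0x{version:08x}")
print("known format:", KNOWN_MAGICS.get(magic, "unknown"))
print("first bytes as text:", head.decode("latin-1", errors="replace"))
```

If the last line prints readable HTML/text rather than binary data, the download itself is the problem rather than the model version.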
PrivateGPT commit: 5a695e9767e24778ffd725ab195bf72916e27ba5
Output of pip show langchain chromadb pygpt4all llama-cpp-python urllib3:
Name: langchain
Version: 0.0.166
Summary: Building applications with LLMs through composability
Home-page: https://www.github.com/hwchase17/langchain
Author:
Author-email:
License: MIT
Location: /home/user/.local/lib/python3.11/site-packages
Requires: aiohttp, dataclasses-json, numexpr, numpy, openapi-schema-pydantic, pydantic, PyYAML, requests, SQLAlchemy, tenacity, tqdm
Required-by:
---
Name: chromadb
Version: 0.3.22
Summary: Chroma.
Home-page:
Author:
Author-email: Jeff Huber <[email protected]>, Anton Troynikov <[email protected]>
License:
Location: /home/user/.local/lib/python3.11/site-packages
Requires: clickhouse-connect, duckdb, fastapi, hnswlib, numpy, pandas, posthog, pydantic, requests, sentence-transformers, typing-extensions, uvicorn
Required-by:
---
Name: pygpt4all
Version: 1.1.0
Summary: Official Python CPU inference for GPT4All language models based on llama.cpp and ggml
Home-page:
Author: Abdeladim Sadiki
Author-email:
License: MIT
Location: /home/user/.local/lib/python3.11/site-packages
Requires: pygptj, pyllamacpp
Required-by:
---
Name: llama-cpp-python
Version: 0.1.48
Summary: A Python wrapper for llama.cpp
Home-page:
Author: Andrei Betlen
Author-email: [email protected]
License: MIT
Location: /home/user/.local/lib/python3.11/site-packages
Requires: typing-extensions
Required-by:
---
Name: urllib3
Version: 1.26.6
Summary: HTTP library with thread-safe connection pooling, file post, and more.
Home-page: https://urllib3.readthedocs.io/
Author: Andrey Petrov
Author-email: [email protected]
License: MIT
Location: /home/user/.local/lib/python3.11/site-packages
Requires:
Required-by: clickhouse-connect, requests
I have yet to figure out which file version to download, but I wanted to flag this.
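If it helps with triage: one way to separate "wrong file contents" from "library version mismatch" might be to load the file directly with llama-cpp-python, outside of privateGPT. A rough sketch (assuming the same llama-cpp-python 0.1.48 as above; the broad except is deliberate since I'm not sure which exception type a bad file raises):

```python
# Rough sketch: load the file directly with llama-cpp-python to see
# whether the same "unknown (magic, version) combination" error shows
# up outside of privateGPT.
from llama_cpp import Llama

MODEL_PATH = "models/ggml-model-q4_0.bin"

try:
    llm = Llama(model_path=MODEL_PATH)
    print("model loaded fine:", MODEL_PATH)
except Exception as exc:  # exact exception type is an assumption, kept broad
    print(f"failed to load {MODEL_PATH}: {exc!r}")
```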