private-gpt
PrivateGPT.py is failing to retrieve the model
After running ingest.py, I ran privateGPT.py and got the error below
Using embedded DuckDB with persistence: data will be stored in: db
Traceback (most recent call last):
File "/content/privateGPT/privateGPT.py", line 76, in
Expected behavior: it should run properly and prompt for my query input
@VimalMathew99, please share your folder structure and the contents of your .env file
PERSIST_DIRECTORY=db
MODEL_TYPE=GPT4All
MODEL_PATH=/ggml-gpt4all-j-v1.3-groovy.bin
EMBEDDINGS_MODEL_NAME=all-MiniLM-L6-v2
MODEL_N_CTX=1000
TARGET_SOURCE_CHUNKS=4
I have kept the model file in the same directory, so I have given "/file_name" as the model path.
I think you should just remove the leading slash (/) from your model path:
MODEL_PATH=ggml-gpt4all-j-v1.3-groovy.bin
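If it still fails after that, here is a small standalone sanity check (not part of privateGPT; it just reads the same .env variable with python-dotenv) to see which path is actually being resolved and whether the file is there:

```python
# check_model_path.py - standalone sanity check, not part of privateGPT
import os
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads .env from the current working directory

model_path = os.environ.get("MODEL_PATH", "")
print("MODEL_PATH   =", repr(model_path))
print("resolves to  =", os.path.abspath(model_path))
print("file exists? =", os.path.exists(model_path))
```

Run it from the same directory you launch privateGPT.py from; if "file exists?" prints False, the path in .env does not point at the downloaded bin.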
Done, still the same error.
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. google-colab 1.0.0 requires requests==2.27.1, but you have requests 2.31.0 which is incompatible.
Could this be related to the error above?
??
I had the exact same issue. I found out that "ggml-gpt4all-j-v1.3-groovy.bin" was not in the directory where I launched python ingest.py. Downloading the bin again solved the issue.
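For anyone else checking the same thing, a rough way to confirm the bin is present and not a truncated download (the size threshold below is only approximate):

```python
# verify_model.py - rough check that the model file is present and plausibly complete
import os

MODEL = "ggml-gpt4all-j-v1.3-groovy.bin"

if not os.path.exists(MODEL):
    print(f"{MODEL} not found in {os.getcwd()} - download it into this folder")
else:
    size_gb = os.path.getsize(MODEL) / 1e9
    print(f"found {MODEL}: {size_gb:.2f} GB")
    # the groovy model is a multi-gigabyte file; a much smaller size
    # usually means the download was interrupted
    if size_gb < 1:
        print("file looks too small - re-download it")
```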
Content of .env file:
PERSIST_DIRECTORY=db
MODEL_TYPE=GPT4All
MODEL_PATH=ggml-gpt4all-j-v1.3-groovy.bin
EMBEDDINGS_MODEL_NAME=all-MiniLM-L6-v2
MODEL_N_CTX=1000
TARGET_SOURCE_CHUNKS=4
Still getting the same error. I also renamed example.env to .env.
The suggestion (i.e. placing the model within a model directory inside the privateGPT folder) in this issue thread worked for me: https://github.com/imartinez/privateGPT/issues/621
This solution also worked for me.
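For anyone landing here later, the layout described in that issue looks roughly like this (the models/ directory name and exact paths are illustrative, not taken from the repo):

```
privateGPT/
├── models/
│   └── ggml-gpt4all-j-v1.3-groovy.bin
├── .env          # MODEL_PATH=models/ggml-gpt4all-j-v1.3-groovy.bin
├── ingest.py
└── privateGPT.py
```

with ingest.py and privateGPT.py run from the privateGPT folder so the relative MODEL_PATH resolves.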