Need help with some errors
File "F:\privateGPT\Lib\site-packages\langchain\embeddings\llamacpp.py", line 79, in validate_environment
values["client"] = Llama(
^^^^^^
File "F:\privateGPT\Lib\site-packages\llama_cpp\llama.py", line 155, in init
self.ctx = llama_cpp.llama_init_from_file(
^^^^^^^^^^^^^^^^^^^^^
File "F:\privateGPT\Lib\site-packages\llama_cpp\llama_cpp.py", line 182, in llama_init_from_file return _lib.llama_init_from_file(path_model, params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ OSError: [WinError -1073741795] Windows Error 0xc000001d During handling of the above exception, another exception occurred:
File "F:\privateGPT\ingest.py", line 62, in
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "pydantic\main.py", line 339, in pydantic.main.BaseModel.init
File "pydantic\main.py", line 1102, in pydantic.main.validate_model
File "F:\privateGPT\Lib\site-packages\langchain\embeddings\llamacpp.py", line 99, in validate_environment
raise NameError(f"Could not load Llama model from path: {model_path}")
NameError: Could not load Llama model from path: F:/privateGPT/models/ggml-model-q4_0.bin
Exception ignored in: <function Llama.del at 0x000002307F085E40>
Traceback (most recent call last):
File "F:\privateGPT\Lib\site-packages\llama_cpp\llama.py", line 978, in del
if self.ctx is not None:
^^^^
AttributeError: 'Llama' object has no attribute 'ctx'
I have a similar issue on my Windows PC.
Follow the instructions in the README.
Make sure to set the proper path to your model in your .env file.
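For reference, a minimal .env might look roughly like this. The key names follow the project's example.env and may differ between privateGPT versions, and the chat-model filename here is just an illustration, so treat this as a sketch rather than the exact required contents:

# sketch only: key names follow privateGPT's example.env and may vary by version
PERSIST_DIRECTORY=db
MODEL_TYPE=GPT4All
MODEL_PATH=F:/privateGPT/models/ggml-gpt4all-j-v1.3-groovy.bin
LLAMA_EMBEDDINGS_MODEL=F:/privateGPT/models/ggml-model-q4_0.bin
MODEL_N_CTX=1000

Use forward slashes (or escaped backslashes) in the paths and check that the .bin files actually exist at those locations; the NameError in the traceback above is what langchain raises when the Llama model file cannot be loaded from the configured path.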