Andrea Pinto
@imartinez did you have the chance to review the PR?
You need to provide the absolute path to your `LLAMA` model in the `.env` file. See #68 for details.
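For reference, here is a minimal sanity check you can run from the project root; the variable name `LLAMA_EMBEDDINGS_MODEL` is an assumption on my side, so check `example.env` for the exact name your copy of the project uses:

```python
# Hypothetical sanity check: assumes the .env entry is named LLAMA_EMBEDDINGS_MODEL
# (check example.env for the exact variable name your version of the project uses).
import os
from dotenv import load_dotenv

load_dotenv()  # pull the variables from .env into os.environ
model_path = os.environ.get("LLAMA_EMBEDDINGS_MODEL")
print("model path:    ", model_path)
print("is absolute:   ", os.path.isabs(model_path or ""))
print("exists on disk:", os.path.exists(model_path or ""))
```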
@krstrid Are you sure your `.env` file is located at the root of the project? Did you add any spacing before or after the `=` when defining your env variables?...
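One quick way to confirm where (and whether) `python-dotenv` is finding your `.env` file is `find_dotenv`; a small sketch, run from the project root (`PERSIST_DIRECTORY` is just used as an example variable here):

```python
# Sketch: confirm where python-dotenv is actually finding the .env file.
import os
from dotenv import find_dotenv, load_dotenv

print("Found .env at:", find_dotenv() or "<not found>")  # '' means not found

load_dotenv()
# repr() makes stray whitespace in the value easy to spot.
print("PERSIST_DIRECTORY =", repr(os.environ.get("PERSIST_DIRECTORY")))
```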
@krstrid You're missing the `dotenv` library; it is not in `requirements.txt` for some reason. Try running:

```
# execute at project root
pip install -r requirements.txt
pip install python-dotenv
```
...
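After that, a quick illustrative check (not part of the project) to confirm the package is visible to the interpreter you run the scripts with:

```python
# If this raises ModuleNotFoundError, python-dotenv is still missing from
# the interpreter that runs ingest.py / privateGPT.py.
import dotenv
print("python-dotenv found at:", dotenv.__file__)
```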
@krstrid can you try modifying `constants.py`, then running `ingest` and sharing the trace again?

```python
# modifications in `constants.py` file (lines 8-9):
PERSIST_DIRECTORY = os.environ.get('PERSIST_DIRECTORY')
print(PERSIST_DIRECTORY)  # NEW!
```

$ python...
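For context, this is roughly what the relevant part of `constants.py` should look like after the change; the `load_dotenv()` call is assumed to already be in your copy. If the new print shows `None`, the values from `.env` are not reaching `os.environ` at all:

```python
# Sketch of the relevant part of constants.py with the extra print
# (assumes the file already imports os and calls load_dotenv()).
import os
from dotenv import load_dotenv

load_dotenv()

PERSIST_DIRECTORY = os.environ.get('PERSIST_DIRECTORY')
print(PERSIST_DIRECTORY)  # NEW! expected: your persist directory, not None
```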
@krstrid Try the following and see what happens:

```
python3 -m pip install -r requirements.txt
python3 -m pip install python-dotenv
python3 ingest.py
python3 privateGPT.py
```
Can you provide more details on your error?
If you're using `gpt4all` and `llama` embeddings, you should be able to ingest all of your documents in Spanish. You should also be able to ask your queries in Spanish...
I am not sure that duplicating the content of the `ingest.py` file and adding a single extra line is the right way to do this. Plus, in the current proposal,...
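For what it's worth, instead of copying the whole file, the extra behaviour could probably be layered on top of the existing code; a rough sketch, assuming `ingest.py` exposes a `main()` function (the names here are placeholders, not the actual proposal):

```python
# ingest_with_extra.py -- hypothetical alternative to duplicating ingest.py:
# reuse the existing entry point and add the extra step around it.
import ingest  # the existing, unmodified module


def extra_step() -> None:
    # Placeholder for the single extra line the PR wants to add.
    print("extra step goes here")


if __name__ == "__main__":
    ingest.main()  # assumes ingest.py defines a main() function
    extra_step()
```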
@maozdemir Yeah. This is just a(n optional) matter of readability.