localGPT
Chat with your documents on your local device using GPT models. No data leaves your device, and it is 100% private.
_Originally posted by @sime2408 in https://github.com/PromtEngineer/localGPT/issues/151#issuecomment-1597633918_ This ticket is to support different methods of document splitting, specifically for different programming languages. Currently, Documents are loaded and then split with vanilla...
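If language-aware splitting is adopted, one approach (a minimal sketch assuming LangChain's `RecursiveCharacterTextSplitter.from_language`; the chunk sizes and the example file are placeholders, not values taken from localGPT) could look like this:

```python
# Sketch: language-aware splitting with LangChain. The splitter for PYTHON
# prefers class/def boundaries over plain character counts.
from langchain.text_splitter import Language, RecursiveCharacterTextSplitter

python_splitter = RecursiveCharacterTextSplitter.from_language(
    language=Language.PYTHON,  # use Python-specific separators (class/def, blank lines)
    chunk_size=1000,           # placeholder values
    chunk_overlap=200,
)

with open("SOURCE_DOCUMENTS/example.py") as f:  # hypothetical source file
    chunks = python_splitter.create_documents([f.read()])

print(f"Produced {len(chunks)} chunks")
```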
Hi, I am running the UI on CPU, but I get an error even though I followed the instructions. Error: No files were found inside SOURCE_DOCUMENTS, please put a starter file...
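A small pre-flight check (a sketch only, assuming the default SOURCE_DOCUMENTS folder sits next to ingest.py) can confirm the folder exists and is not empty before ingesting:

```python
# Sketch: verify SOURCE_DOCUMENTS exists and holds at least one file
# before running ingest.py. The folder name comes from the error message.
import os

source_dir = "SOURCE_DOCUMENTS"
os.makedirs(source_dir, exist_ok=True)

files = [f for f in os.listdir(source_dir) if not f.startswith(".")]
if not files:
    raise SystemExit(
        f"{source_dir} is empty; copy at least one document (e.g. a .txt or .pdf) "
        "into it before running ingest.py"
    )
print(f"Found {len(files)} file(s): {files}")
```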
Hi, on Windows 11, ingest.py fails. I got the following error:
```
Traceback (most recent call last):
  File "ingest.py", line 44, in <module>
    def load_documents(source_dir: str) -> list[Document]:
TypeError: 'type' object...
```
When attempting `python ingest.py` I get the error `TypeError: issubclass() arg 1 must be a class`:
```
  File "[...]/anaconda3/lib/python3.8/typing.py", line 774, in __subclasscheck__
    return issubclass(cls, self.__origin__)
```
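Both tracebacks above are typical of running the scripts on Python 3.8, where built-in generics such as `list[Document]` cannot be subscripted in annotations. A hedged sketch of the annotation change that avoids the definition-time failure (the import path reflects LangChain as of mid-2023 and may differ):

```python
# Sketch: list[Document] is only subscriptable from Python 3.9 onward, so
# typing.List works on older interpreters. Alternatively, deferring
# annotation evaluation with `from __future__ import annotations` also
# avoids the error at definition time.
from typing import List

from langchain.docstore.document import Document  # import path as of mid-2023

def load_documents(source_dir: str) -> List[Document]:
    """Placeholder body; the real loader lives in ingest.py."""
    ...
```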
Hi, I have a set of different documents, all of a similar nature. Can I ask a question and have the model restrict itself to just one document?...
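One hedged way to do this with the Chroma vector store is to filter retrieval on the `source` metadata attached to each chunk; the paths, embedding model, and persist directory below are assumptions, not localGPT's exact configuration:

```python
# Sketch: restrict retrieval to a single source file via a metadata filter.
from langchain.embeddings import HuggingFaceInstructEmbeddings
from langchain.vectorstores import Chroma

embeddings = HuggingFaceInstructEmbeddings(model_name="hkunlp/instructor-large")
db = Chroma(persist_directory="DB", embedding_function=embeddings)

retriever = db.as_retriever(
    search_kwargs={
        "k": 4,
        # Only return chunks whose "source" metadata matches one document.
        "filter": {"source": "SOURCE_DOCUMENTS/constitution.pdf"},  # placeholder path
    }
)
docs = retriever.get_relevant_documents("What does this document say about X?")
```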
My files have nothing to do with this information, yet I got an answer. I shouldn't have gotten an answer.
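This usually means the model answered from its own knowledge rather than from the retrieved context. One hedged mitigation is a stricter prompt; the wording and chain wiring below are illustrative, not the project's shipped prompt:

```python
# Sketch: a prompt that tells the model to refuse when the retrieved
# context does not contain the answer.
from langchain.prompts import PromptTemplate

template = """Use only the following context to answer the question.
If the context does not contain the answer, say "I don't know" instead of
answering from general knowledge.

Context: {context}

Question: {question}
Answer:"""

QA_PROMPT = PromptTemplate(template=template, input_variables=["context", "question"])
# Passed to a RetrievalQA chain via chain_type_kwargs={"prompt": QA_PROMPT}
```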
I have an Amazon EC2 machine with 32 GB RAM and a Tesla T4 GPU. When trying to run the script on CPU as well as with the GPU enabled, it is able to download...
```
Wed Jun 14 20:11:53 2023
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 470.182.03   Driver Version: 470.182.03   CUDA Version: 11.4     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan ...
```
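A quick sanity check (a sketch) that PyTorch can actually see the GPU; the driver showing up in nvidia-smi is not enough if torch was installed without CUDA support:

```python
# Sketch: confirm the installed torch build can use the T4 before blaming the script.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    print("Torch CUDA build:", torch.version.cuda)
```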
To use MPS, you have to install llama-cpp-python with these env vars:
```bash
export CMAKE_ARGS="-DLLAMA_METAL=on"
export FORCE_CMAKE=1
```
Otherwise, run this to upgrade from the current install:
```bash
CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 pip install...
```
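Once reinstalled, a short sketch to confirm that Metal offload is active (the model path is a placeholder; any `n_gpu_layers` value above 0 should trigger offload on Apple Silicon):

```python
# Sketch: load a model with GPU layer offload enabled and run a tiny prompt.
from llama_cpp import Llama

llm = Llama(
    model_path="models/your-model.ggml.bin",  # placeholder path
    n_gpu_layers=1,  # any value > 0 enables Metal offload
    n_ctx=2048,
)
out = llm("Q: What is 2 + 2? A:", max_tokens=16)
print(out["choices"][0]["text"])
```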
I use Anaconda for setup. When running ingest.py, the following error occurs:
```
PS D:\privateGPT-main> & D:/Users/okokook/anaconda3/envs/python311/python.exe d:/localGPT-main/ingest.py
2023-06-30 20:31:25,891 - INFO - ingest.py:120 - Loading documents from D:\localGPT-main/SOURCE_DOCUMENTS
2023-06-30 20:31:29,854...
```