localGPT
Chat with your documents on your local device using GPT models. No data leaves your device, and everything stays 100% private.
RuntimeError: CUDA error: out of memory
CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1....
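The message above is PyTorch's standard warning about asynchronously reported CUDA errors. Below is a minimal sketch of the two usual responses: setting the environment variable the error text itself suggests, and falling back to the CPU the way other reports here do with `--device_type cpu`. The snippet is illustrative, not part of localGPT's own code.

```python
import os

# Make CUDA errors surface at the failing call instead of a later API call,
# as the RuntimeError message suggests; must be set before CUDA is initialized.
os.environ["CUDA_LAUNCH_BLOCKING"] = "1"

import torch

# If the GPU is simply out of memory, the common workaround for localGPT is to
# run on CPU instead, e.g. `python ingest.py --device_type cpu`.
device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"Using device: {device}")
```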
When I run the ingest.py file, the following long error occurs:
As per your request, here is the error I get after installing localGPT:
PS C:\localGPT> python ingest.py
C:\Users\Name\AppData\Local\Programs\Python\Python310\lib\site-packages\numpy\_distributor_init.py:30: UserWarning: loaded more than 1 DLL from .libs: C:\Users\Name\AppData\Local\Programs\Python\Python310\lib\site-packages\numpy\.libs\libopenblas.FB5AE2TYXYH2IJRDKGDGQ3XBKLKTF43H.gfortran-win_amd64.dll C:\Users\Name\AppData\Local\Programs\Python\Python310\lib\site-packages\numpy\.libs\libopenblas64__v0.3.21-gcc_10_3_0.dll
warnings.warn("loaded more than...
[localGPT] main*% [2d,16h,12m] → $ python3 ingest.py --device_type cpu
Loading documents from /home/ni-user/Desktop/localGPT/SOURCE_DOCUMENTS
Loaded 1 documents from /home/ni-user/Desktop/localGPT/SOURCE_DOCUMENTS
Split into 148 chunks of text
load INSTRUCTOR_Transformer
Killed
[localGPT] main*% [2d,16h,12m]...
This pull request adds the ability to use XLSX files as a document source. This feature allows users to import data from XLSX files into the application, providing more flexibility...
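As a rough illustration of what such a change might look like, the sketch below registers an Excel loader in an extension-to-loader map in the style ingest.py uses for other document types. It assumes LangChain's UnstructuredExcelLoader; the actual pull request may wire in a different loader, and the helper name here is only a placeholder.

```python
from langchain.document_loaders import UnstructuredExcelLoader

# Hypothetical mapping from file extension to loader class, following the
# pattern ingest.py uses for its other supported document types.
DOCUMENT_MAP = {
    ".xlsx": UnstructuredExcelLoader,
}

def load_single_document(file_path: str):
    """Load one source document using the loader registered for its extension."""
    ext = "." + file_path.rsplit(".", 1)[-1].lower()
    loader_class = DOCUMENT_MAP.get(ext)
    if loader_class is None:
        raise ValueError(f"Unsupported file type: {ext}")
    return loader_class(file_path).load()
```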
Please read the README file. You need to implement the changes mentioned in ingest.py, run_localGPT.py, and instructor.py; instructor.py is inside the InstructorEmbeddings folder. The "instructor.py" is probably embedded similarly to this: file_path...
## Actions taken:
Ran the command python run_localGPT.py --device_type cpu. ingest.py --device_type cpu was run before this with no issues.
## Expected result:
For the "> Enter a query:" prompt...
How can we save the embeddings of a particular file and use it later for question-answering?
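One common answer is to persist the vector index to disk so the embeddings are computed only once. Below is a minimal sketch assuming the LangChain + Chroma + InstructorEmbeddings stack that localGPT builds on; the directory name, model choice, and query string are placeholders, not localGPT's exact configuration.

```python
from langchain.embeddings import HuggingFaceInstructEmbeddings
from langchain.vectorstores import Chroma

PERSIST_DIRECTORY = "DB"  # hypothetical folder where the index is written

embeddings = HuggingFaceInstructEmbeddings(model_name="hkunlp/instructor-large")

# One-time step: embed the chunks produced by the ingest step and persist
# them to disk. `texts` would be the list of split document chunks.
# db = Chroma.from_documents(texts, embeddings, persist_directory=PERSIST_DIRECTORY)
# db.persist()

# Later: reload the saved index without re-embedding anything, then query it
# for question-answering.
db = Chroma(persist_directory=PERSIST_DIRECTORY, embedding_function=embeddings)
retriever = db.as_retriever()
docs = retriever.get_relevant_documents("What does the document say about termination?")
```

Reloading with the same embedding model that built the index is important; mixing models makes the stored vectors and query vectors incomparable.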