localGPT
Chat with your documents on your local device using GPT models. No data leaves your device, and everything is 100% private.
fix link in markdown
### Describe the bug:
Running `pip install -r requirements.txt` resulted in an error while building the wheel for llama-cpp-python.
### Reproduction:
```
pip install -r requirements.txt
ERROR: Failed building wheel for llama-cpp-python
```
###...
### Error:
```bash
~/Documents/github_pascalandy/localGPT|main⚡ ⇒ python ingest.py
Traceback (most recent call last):
  File "/Users/andy16/Documents/github_pascalandy/localGPT/ingest.py", line 4, in <module>
    from utils import xlxs_to_csv
  File "/Users/andy16/Documents/github_pascalandy/localGPT/utils.py", line 1, in <module>
    import openpyxl
ModuleNotFoundError: No...
```
Ran the command according to the README but encountered an error saying the module openpyxl cannot be found. After `pip install openpyxl`, the program runs normally.
ingest.py does not support multiple cores.
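One way to parallelize ingestion would be to load documents across worker processes. The sketch below is only an illustration of the idea, assuming a hypothetical per-file loader (`load_single_document` here is a stand-in, not localGPT's actual API):

```python
import os
from concurrent.futures import ProcessPoolExecutor

def load_single_document(path):
    # Hypothetical per-file loader; in localGPT this would dispatch
    # to the appropriate LangChain loader for the file type.
    with open(path, "rb") as f:
        return f.read()

def load_documents_parallel(paths, workers=None):
    # Spread per-file loading across CPU cores; defaults to the
    # number of available cores when workers is not given.
    workers = workers or os.cpu_count()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # pool.map preserves the input order of paths
        return list(pool.map(load_single_document, paths))
```

Splitting the work per file keeps each task independent, so no coordination between workers is needed beyond collecting results.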
This pull request addresses Issue #72 and includes the following changes: 1. Refactor document loading in `ingest.py` to utilize the `DOCUMENT_MAP` dictionary from `constants.py`. This improves modularity and allows for...
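A `DOCUMENT_MAP`-style dispatch as described above might look like the following sketch. The loader class and extensions here are illustrative assumptions, not necessarily the ones the PR uses:

```python
import os

class TextLoader:
    # Stand-in loader class; localGPT's DOCUMENT_MAP would map
    # extensions to the project's real loader classes.
    def __init__(self, path):
        self.path = path

    def load(self):
        with open(self.path, encoding="utf-8") as f:
            return f.read()

# Map file extensions to loader classes (illustrative entries)
DOCUMENT_MAP = {
    ".txt": TextLoader,
    ".md": TextLoader,
}

def load_document(path):
    # Pick the loader by extension; unknown types fail early.
    ext = os.path.splitext(path)[1].lower()
    loader_cls = DOCUMENT_MAP.get(ext)
    if loader_cls is None:
        raise ValueError(f"Unsupported file type: {ext}")
    return loader_cls(path).load()
```

Centralizing the mapping in one dictionary means adding support for a new file type is a one-line change rather than another `if/elif` branch.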
This isn't ideal:
```py
@click.command()
@click.option('--device_type', default='cuda', help='device to run on, select gpu, cpu or mps')
def main(device_type, ):
    # load the instructorEmbeddings
    if device_type in ['cpu', 'CPU']:
        device='cpu'
    elif...
```
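A tidier alternative would be to let click validate the value instead of chaining `if/elif` branches. This is a sketch using `click.Choice`; the option name follows the snippet above, while the accepted values and normalization are assumptions:

```python
import click

@click.command()
@click.option(
    "--device_type",
    default="cuda",
    type=click.Choice(["cpu", "cuda", "mps"], case_sensitive=False),
    help="Device to run on: cpu, cuda, or mps.",
)
def main(device_type):
    # click.Choice rejects invalid values and, with case_sensitive=False,
    # normalizes the input to the matching choice (e.g. "CPU" -> "cpu").
    click.echo(f"Running on device: {device_type}")
```

This removes the manual case handling entirely and gives users a clear error message for unsupported devices.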
This issue occurs when running the run_localGPT.py file. I've tried both the cpu and cuda devices, but both result in the same issue below when loading checkpoint shards. The warning itself...
gradio demo
It would be great to have a Gradio UI for this, similar to https://github.com/oobabooga/text-generation-webui
I just ran those .py files as instructed in the README.md file. After that, a bunch of stuff was downloaded, which took up a ton of space on my system....