
This repo showcases how you can run a model locally and offline, free of OpenAI dependencies.
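As a rough illustration of the idea, here is a minimal sketch of querying a local GGML model with llama-cpp-python. The repo itself wires a local model into a Streamlit app with a document index, so the model path and prompt below are placeholders, not the project's actual configuration.

```python
# Minimal sketch: query a local GGML/GGUF model entirely offline.
# Assumes llama-cpp-python is installed and a model file has already
# been downloaded; the path and prompt below are placeholders.
from llama_cpp import Llama

llm = Llama(model_path="./models/ggml-model-q4_0.bin", n_ctx=2048)

output = llm("Q: What does this repo demonstrate? A:", max_tokens=64)
print(output["choices"][0]["text"])
```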

9 local_llama issues, sorted by recently updated

Hello, trying to set up the project, running into issues. ## Previous issues that seem related Maybe this should be an issue for the upstream library itself. Maybe these are related?...

Hi, First, this is a great project. I love it! I tried to run the v3 as I installed a few LLMs with ollama (which works fine). But I keep...

streamlit.errors.StreamlitAPIException: `set_page_config()` can only be called once per app page, and must be called as the first Streamlit command in your script. is thrown after following instructions and filling the...
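For reference, the error above comes from Streamlit's requirement that set_page_config() be called exactly once, before any other Streamlit command. A minimal sketch of the expected ordering (the page title is a placeholder, not the project's actual value):

```python
# Sketch of the ordering Streamlit enforces: set_page_config() must be
# the very first Streamlit command and may only be called once per page.
import streamlit as st

st.set_page_config(page_title="local_llama")  # must precede any other st.* call

st.title("local_llama")
st.write("Model and index setup would follow here.")
```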

This is an awesome project. I pulled the code and got it up and running quickly. Do you have any ideas on how to improve the query results from my uploaded documents?...

Whenever I submit a prompt after attaching a PDF file, I get this error: FileNotFoundError: [Errno 2] No such file or directory: 'C:/Users/avashish/GPT_INDEXES/None/docstore.json' Traceback: File "C:\Users\avashish\AppData\Local\anaconda3\lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 552, in _run_script...

Hello, it is wonderful work done here! I would appreciate any guidance on how to run local-llama with multi-GPU. Thank you.

Hi! Thank you very much! When I try to download a PDF, I get an error. Can you please tell me what can be done here? ``` FileNotFoundError: [Errno 2]...

I tried the following models: ``` MODEL_NAME = 'ggml-vicuna-7b-q4_0.bin' MODEL_PATH = r"D:\\ggml-vicuna-7b-q4_0.bin" MODEL_NAME = 'GPT4All-13B-snoozy.ggmlv3.q4_1.bin' MODEL_PATH = r"D:\\GPT4All-13B-snoozy.ggmlv3.q4_1.bin" MODEL_NAME = 'ggml-old-vic7b-q4_0.bin' MODEL_PATH = r"C:\\Users\\elnuevo\\Downloads\\ggml-old-vic7b-q4_0.bin" ``` But only the GPT4All models...

So I did a fresh install (pip install -r requirements.txt) in conda and stumbled across this error. As you might see in my profile, I do not open issues that...