Local LLMs (self-hosting / desktop client)

Open pacoccino opened this issue 1 year ago • 4 comments

Why Local LLMs are on the rise, such as Ollama, LM Studio, etc. It's already possible to use them with big-AGI, but only if they are accessible from the machine that hosts the big-AGI server.

Description Being able to use local LLMs that run on the machine of the client accessing the big-AGI frontend. That would allow people who don't self-host to use their own local LLMs, and hosters wouldn't have to run LLMs for all their users.

Requirements The frontend has to call the local LLM directly, no longer the backend.
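A minimal sketch of what that frontend-side call could look like, assuming an Ollama server on its default port (`11434`) and a model name like `llama3` (both placeholders, not anything big-AGI ships today). Since the request originates in the browser, the local server also has to allow CORS from the big-AGI origin (for Ollama, via the `OLLAMA_ORIGINS` environment variable):

```typescript
// A chat message in the shape Ollama's /api/chat endpoint expects.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body for a non-streaming Ollama chat request.
function buildOllamaChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// Call the user's local LLM straight from the browser, bypassing
// the big-AGI backend entirely. Endpoint and model are assumptions.
async function chatWithLocalLlm(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(
      buildOllamaChatRequest("llama3", [{ role: "user", content: prompt }])
    ),
  });
  if (!res.ok) throw new Error(`local LLM returned ${res.status}`);
  const data = await res.json();
  // Ollama's non-streaming chat response carries the reply in message.content.
  return data.message.content;
}
```

The key design shift is exactly what the requirement states: the `fetch` target is `localhost` as seen by the *client's* browser, so the big-AGI server never touches the model traffic.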

pacoccino avatar Feb 17 '24 11:02 pacoccino