
Integration with Ollama

Open · ceramicv opened this issue 7 months ago · 4 comments

I already have many models downloaded for use with locally installed Ollama.

Since my Ollama server is always running, is there a way to get GPT4All to use the models being served by Ollama? Alternatively, can I point GPT4All at the directory where Ollama keeps its already-downloaded LLMs so it can use those, without having to download new models specifically for GPT4All?
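For reference, here is a minimal sketch of one possible workaround I was thinking of: reading an Ollama model's manifest to find the GGUF weights blob it has already downloaded, which could then be symlinked into GPT4All's models folder. It assumes Ollama's default on-disk layout (`~/.ollama/models` with `manifests/` and `blobs/`), which may differ across versions and platforms; the model name and the GPT4All models path below are placeholders.

```python
#!/usr/bin/env python3
# Hypothetical sketch: locate the GGUF weights blob Ollama has already
# downloaded for a model, so it could be reused by another tool.
# Assumes Ollama's default storage layout (~/.ollama/models).
import json
from pathlib import Path

OLLAMA_MODELS = Path.home() / ".ollama" / "models"

def find_gguf_blob(name: str, tag: str = "latest") -> Path:
    """Return the path of the GGUF blob referenced by a model's manifest."""
    manifest_path = (OLLAMA_MODELS / "manifests" / "registry.ollama.ai"
                     / "library" / name / tag)
    manifest = json.loads(manifest_path.read_text())
    for layer in manifest["layers"]:
        # The layer that holds the model weights uses this media type.
        if layer["mediaType"] == "application/vnd.ollama.image.model":
            # Manifest digests look like "sha256:<hex>"; blob files are
            # named "sha256-<hex>".
            return OLLAMA_MODELS / "blobs" / layer["digest"].replace(":", "-")
    raise FileNotFoundError(f"No model layer found for {name}:{tag}")

if __name__ == "__main__":
    blob = find_gguf_blob("llama3")  # placeholder model name
    print(blob)
    # One could then symlink it wherever GPT4All looks for models, e.g.
    # (placeholder path, check your own GPT4All settings):
    # (Path.home() / ".local/share/nomic.ai/GPT4All/llama3.gguf").symlink_to(blob)
```

Whether GPT4All will actually load a model referenced this way (or can talk to Ollama's server directly) is exactly what I'm asking about.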

ceramicv · Jul 03 '24 20:07