
requesting to add ollama.

Open thesanju opened this issue 1 year ago • 3 comments

thesanju avatar Nov 11 '23 06:11 thesanju

We need prompting strategies that work reliably with OSS models first.

hwchase17 avatar Nov 11 '23 20:11 hwchase17

Yes, I'll be waiting for that feature. Imagine GPTs running locally and doing things in the background while you work on other things.

thesanju avatar Nov 11 '23 21:11 thesanju

@thesanju, Ollama is now supported out-of-the-box. I just tested it with the latest code and it works as expected. Please give it a try.

If you're running Ollama from your local machine (http://localhost:11434), you'll need to ensure the backend Docker service has access to the Ollama API. Because localhost will reference the container itself, you'll need to use the special DNS name host.docker.internal to refer to your host machine. You can specify the environment variable OLLAMA_BASE_URL=http://host.docker.internal:11434 so that the backend service will point to the Ollama API running on the host machine.
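For example, if you run the backend with Docker Compose, the configuration might look like the sketch below. The service name `backend` is an assumption; use the name from the project's own compose file. The `extra_hosts` entry is only needed on Linux, where `host.docker.internal` is not defined by default.

```yaml
# docker-compose.override.yml (illustrative sketch; "backend" is an
# assumed service name -- match it to the project's compose file)
services:
  backend:
    environment:
      # Point the backend at the Ollama API running on the host machine.
      # Inside the container, localhost refers to the container itself,
      # so host.docker.internal is used to reach the host.
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      # On Linux, map host.docker.internal to the host gateway manually.
      # (Docker Desktop on macOS/Windows provides this name automatically.)
      - "host.docker.internal:host-gateway"
```

After restarting the service, the backend should be able to reach the Ollama API at the configured URL.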

andrewnguonly avatar Apr 09 '24 01:04 andrewnguonly