
--local should respect $OLLAMA_HOST rather than defaulting to localhost

Open sammcj opened this issue 1 year ago • 2 comments

Is your feature request related to a problem? Please describe.

No response

Describe the solution you'd like

$OLLAMA_HOST is the standard way to define your Ollama server URL. It is used by various Ollama clients including the official Ollama command line tool.

At present --local uses a hardcoded api_base of http://localhost:11434

It would be nice if it defaulted to the $OLLAMA_HOST environment variable value (if found).
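The requested fallback could be sketched roughly as below. This is a hypothetical illustration, not code from the open-interpreter repo; the function and constant names are made up for the example.

```python
import os

# Current behavior: a hardcoded default pointing at localhost.
DEFAULT_OLLAMA_API_BASE = "http://localhost:11434"

def resolve_ollama_api_base() -> str:
    """Prefer $OLLAMA_HOST when set, otherwise fall back to localhost."""
    host = os.environ.get("OLLAMA_HOST", "").strip()
    if not host:
        return DEFAULT_OLLAMA_API_BASE
    # The Ollama CLI accepts OLLAMA_HOST values without a scheme
    # (e.g. "192.168.1.50:11434"), so normalize to a full URL.
    if not host.startswith(("http://", "https://")):
        host = "http://" + host
    return host.rstrip("/")
```

With this in place, `interpreter --local` would pick up a remote server from the environment while keeping today's default for users who never set the variable.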

Describe alternatives you've considered

No response

Additional context

No response

sammcj avatar Aug 20 '24 10:08 sammcj

100% agree. It's very hard to use when it's restricted to localhost. I'd guess a lot of people run Ollama on a machine with GPUs that isn't their actual localhost, so please make this work. I even tried the --api_base parameter but ran into errors (llm.py and litellm).

suizideFloat avatar Aug 20 '24 15:08 suizideFloat

+10 to this one - in fact I run Ollama on 4 PCs... with the heavy models on the ones with better GPUs. It would be great if the client could understand this and route LLM traffic efficiently.
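The multi-host setup described above could be handled client-side with a simple model-to-host map, falling back to $OLLAMA_HOST for anything unlisted. Everything here is hypothetical: the hostnames, the mapping, and the function name are placeholders, not an existing open-interpreter feature.

```python
import os

# Hypothetical mapping: heavy models go to the machine with the
# better GPU, small models can run anywhere. Hostnames are placeholders.
MODEL_HOSTS = {
    "llama3:70b": "http://gpu-tower:11434",
    "codellama:34b": "http://gpu-tower:11434",
    "phi3:mini": "http://laptop:11434",
}

def api_base_for(model: str) -> str:
    """Route a model to its configured host, else use $OLLAMA_HOST/localhost."""
    default = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    return MODEL_HOSTS.get(model, default)
```

Respecting $OLLAMA_HOST would be the first step; per-model routing like this could layer on top of it without changing the single-host default.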

mdlmarkham avatar Sep 12 '24 17:09 mdlmarkham