Feature request: Ollama as an API option
First of all, thanks for the truly excellent tool! mods has become an indispensable part of my life on the CLI.
This is a humble feature request:
Provide an option to use ollama as an API, as in the following example:
mods --api ollama --model llama2:latest "Why is the sky blue?"
Ollama provides a REST API; see their docs here.
An example API call:
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
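For reference, the same call can be made from a script. Below is a minimal Python sketch against the default local Ollama endpoint, using only the standard library; the model name and prompt are just the ones from the example above:

```python
import json
import urllib.request

# Default address of a local Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False requests a single JSON response instead of a stream
    of chunks, which keeps the parsing below simple.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    """POST the request and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the text in "response".
        return json.loads(resp.read())["response"]
```

With a local Ollama instance running, generate("llama2", "Why is the sky blue?") should return the model's answer as a string.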
Accordingly, it would be nice to have this built in.
Meanwhile, it has been suggested in an Ollama issue that the litellm proxy can be used as a translation layer between the OpenAI format and Ollama, e.g. with
litellm --model ollama/llama2 \
--api_base http://localhost:11434 \
--host "127.0.0.1" --telemetry False \
--temperature 0.3 --max_tokens 2048
and the corresponding section of mods' settings YAML (mods --settings), e.g. the localai section, replaced with
base-url: http://localhost:8000
api-key: "IGNORED"
models:
  llama2:
    aliases: ["local"]
    max-input-chars: 12250
    fallback:
I wrapped the Ollama/litellm/mods configuration together in one script. See: https://github.com/yeahdongcn/MacAI/blob/main/start.sh
Ollama has added OpenAI compatibility as of v0.1.24: https://github.com/ollama/ollama/releases/tag/v0.1.24
Ollama works now that it has the OpenAI api compatibility.
Example:
ollama:
  base-url: http://localhost:11434/v1
  api-key-env: NA
  models:
    "codellama:7b":
      max-input-chars: 4000
then mods -m codellama:7b works
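Since the /v1 endpoint speaks the OpenAI wire format, any OpenAI-style client can talk to it directly. A minimal stdlib-only sketch, assuming the default local port and the codellama:7b model from the config above:

```python
import json
import urllib.request

# Ollama's OpenAI-compatible chat endpoint (default local port).
CHAT_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, content: str) -> bytes:
    """Build an OpenAI-format chat completion request body."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": content}],
    }).encode()

def chat(model: str, content: str) -> str:
    """POST the chat request and return the assistant's reply."""
    req = urllib.request.Request(
        CHAT_URL,
        data=build_chat_request(model, content),
        # Ollama ignores the API key, but OpenAI-format clients send one.
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer NA"},
    )
    with urllib.request.urlopen(req) as resp:
        # Standard OpenAI response shape: choices[0].message.content.
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

With a local Ollama running, chat("codellama:7b", "Why is the sky blue?") should return the model's reply; this is the same request shape mods sends when pointed at the ollama API entry.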
The next version of Ollama will support their messaging API, as well as Anthropic-based models.
Ollama support was added on main.
Will close; please reopen if needed.