
Feature request: Ollama as an API option

Open qjcg opened this issue 2 years ago • 4 comments

First of all, thanks for the truly excellent tool! mods has become an indispensable part of my life on the CLI.

This is a humble feature request:

Provide an option to use ollama as an API, as in the following example:

mods --api ollama --model llama2:latest "Why is the sky blue?"

Ollama provides a REST API; see their docs.

An example API call:

curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt":"Why is the sky blue?"
}'
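For anyone wiring this up by hand before native support lands: Ollama's `/api/generate` endpoint answers the request above with a stream of newline-delimited JSON objects, each carrying a `"response"` text fragment and a final `"done": true` marker. A minimal sketch of calling it from Python (assuming the default port; the `join_stream` helper is illustrative, not part of Ollama or mods):

```python
import json
import urllib.request


def join_stream(lines):
    """Concatenate the "response" fragments from a newline-delimited
    JSON stream, stopping at the object marked "done"."""
    out = []
    for line in lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        out.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(out)


def generate(prompt, model="llama2", base="http://localhost:11434"):
    """POST a prompt to Ollama's /api/generate and return the full reply."""
    req = urllib.request.Request(
        f"{base}/api/generate",
        data=json.dumps({"model": model, "prompt": prompt}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Read the whole stream, then join the per-line fragments.
        return join_stream(resp.read().decode().splitlines())
```

This is roughly what a built-in `--api ollama` backend would have to do before Ollama gained an OpenAI-compatible endpoint (see the later comments).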

qjcg avatar Nov 21 '23 10:11 qjcg

Following; it would be nice to have this built in. Meanwhile, it has been suggested in an ollama issue that the litellm proxy can be used as a translation layer between the OpenAI format and ollama, e.g. with

litellm --model ollama/llama2 \
    --api_base http://localhost:11434 \
    --host "127.0.0.1" --telemetry False \
    --temperature 0.3 --max_tokens 2048

and with the localai section in mods' settings YAML (`mods --settings`) replaced with

    base-url: http://localhost:8000
    api-key: "IGNORED"
    models:
      llama2:
        aliases: ["local"]
        max-input-chars: 12250
        fallback:

daviehh avatar Jan 07 '24 00:01 daviehh

Wrapped up Ollama/litellm/mods config together in one script. See: https://github.com/yeahdongcn/MacAI/blob/main/start.sh

yeahdongcn avatar Jan 17 '24 09:01 yeahdongcn

ollama has added OpenAI compatibility: https://github.com/ollama/ollama/releases/tag/v0.1.24

jmbiaudis avatar Feb 14 '24 20:02 jmbiaudis

Ollama works now that it has OpenAI API compatibility.

Example:

  ollama:
    base-url: http://localhost:11434/v1
    api-key-env: NA
    models:
      "codellama:7b":
        max-input-chars: 4000

then mods -m codellama:7b works
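The config above works because Ollama now exposes an OpenAI-style `/v1/chat/completions` endpoint, which is what mods speaks. A minimal sketch of hitting that endpoint directly, useful for checking the server before pointing mods at it (assumes the default port; the `extract_reply` helper is illustrative):

```python
import json
import urllib.request


def extract_reply(body):
    """Pull the assistant text out of an OpenAI-style chat completion body."""
    return json.loads(body)["choices"][0]["message"]["content"]


def chat(prompt, model="codellama:7b", base="http://localhost:11434/v1"):
    """Send a single-turn chat request to Ollama's OpenAI-compatible API."""
    req = urllib.request.Request(
        f"{base}/chat/completions",
        data=json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }).encode(),
        headers={
            "Content-Type": "application/json",
            # Ollama does not check the key, but OpenAI clients send one,
            # hence the `api-key-env: NA` placeholder in the config above.
            "Authorization": "Bearer NA",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return extract_reply(resp.read().decode())
```

If this returns text, `mods -m codellama:7b` with the matching `base-url` should work too.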

garyblankenship avatar Mar 19 '24 03:03 garyblankenship

The next version of ollama will support their messaging API, as well as Anthropic-based models.

cloudbridgeuy avatar May 24 '24 23:05 cloudbridgeuy

ollama support was added on main

will close, please reopen if needed.

caarlos0 avatar Jun 03 '24 17:06 caarlos0