
Truly can't make it connect to Ollama

TremendaCarucha opened this issue 10 months ago • 4 comments

Describe the bug
I tried all the suggestions in the issues here: using /v1, using /api/generate, and both together. It just fails to connect and there's no info; I see no log in Ollama either. My humble opinion: you took all this effort to build this, so it would probably be relatively quick to give it better Ollama support. There are a ton of people using Ollama, and it opens up a world of opportunities. Hopefully you can make it work well without too much trouble.

Checks

  • Did you restart the app after editing settings? Yes
  • Are you on the latest version of the app? Yes
  • In case of poor performance, did you try guiding the LLM in the "Custom LLM Instructions" section?

Environment

  • Version: 22
  • OS: Ubuntu
  • LLM: LLaVA 7B via Ollama (the specific model doesn't matter)

TremendaCarucha avatar Feb 03 '25 01:02 TremendaCarucha

I was getting the same issue, and what got it to connect for me was entering "xxx" into the API Key field
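For what it's worth, Ollama itself doesn't appear to validate the key at all; the placeholder just satisfies the OpenAI-style client, which refuses to send requests with an empty API key. A rough check against Ollama's OpenAI-compatible route (a sketch, assuming the default port):

# any Bearer token works here; Ollama ignores it, the client just needs a non-empty value
curl http://localhost:11434/v1/models -H "Authorization: Bearer xxx"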

HornMichaelS avatar Feb 05 '25 17:02 HornMichaelS

That's a well-made point. Let me add dedicated support for some of the usual Ollama models soon. If you've worked extensively with them, please feel free to open a PR as well!

In the meantime, thank you for your fix @HornMichaelS - I've updated the README to mention it more clearly now.

AmberSahdev avatar Feb 06 '25 01:02 AmberSahdev

Hello @AmberSahdev @HornMichaelS @TremendaCarucha ,

Can you please specify the exact URL endpoint for Ollama? I can't figure it out and keep getting a 404.

  • I have tested the following endpoint successfully: http://localhost:11434/api/generate (see below)
# call
curl -X POST http://localhost:11434/api/generate -H "Content-Type: application/json" -d '{"prompt": "Hello, Ollama! How are you today?", "model":"llama3.1"}'

# answer
{"model":"llama3.1","created_at":"2025-03-05T19:57:39.203790878Z","response":"Nice","done":false}
{"model":"llama3.1","created_at":"2025-03-05T19:57:39.371888161Z","response":" to","done":false}
{"model":"llama3.1","created_at":"2025-03-05T19:57:39.536946935Z","response":" meet","done":false}
{"model":"llama3.1","created_at":"2025-03-05T19:57:39.70213797Z","response":" you","done":false}
{"model":"llama3.1","created_at":"2025-03-05T19:57:39.872557526Z","response":"!","done":false}
...
  • but from the app I keep getting a 404. Please see the screenshot attached; I can't figure it out. (A possible fix is sketched below the screenshot.)

Image
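If the app uses an OpenAI-style client (which the "xxx" API key workaround above suggests), a 404 on that base URL would make sense: the client appends an OpenAI-style path such as /chat/completions, whereas /api/generate is Ollama's native API. A sketch of the OpenAI-compatible route, assuming the default port, a pulled llama3.1 model, and the placeholder key mentioned above:

# call (OpenAI-compatible endpoint; the base URL in the app would then be http://localhost:11434/v1)
curl http://localhost:11434/v1/chat/completions -H "Content-Type: application/json" -H "Authorization: Bearer xxx" -d '{"model": "llama3.1", "messages": [{"role": "user", "content": "Hello, Ollama! How are you today?"}]}'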

pnmartinez avatar Mar 05 '25 20:03 pnmartinez

Ollama works fine for me, but Open Interface believes I'm on macOS (and then no actions are performed).

The Ollama endpoint for the OpenAI-compatible API looks like the one below (working for me):

Image
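The screenshot isn't reproduced here; the exact value used is only visible in it. For reference, Ollama normally serves its OpenAI-compatible API under the /v1 prefix, so the base URL would presumably look something like this:

# assumed typical base URL for Ollama's OpenAI-compatible API
http://localhost:11434/v1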

lemassykoi avatar Apr 17 '25 08:04 lemassykoi