
Ollama NOT working locally, in Codespaces, or in Docker

Open rony432 opened this issue 10 months ago • 8 comments

The main problem is that the OpenAI package is taking priority over Ollama; it keeps complaining about the API key. All projects that use the official openai package have this issue with Ollama.

https://github.com/wandb/openui/issues/18#issuecomment-2034503485

Please add an Ollama server option in the OpenUI settings to force it. By default it doesn't check for Ollama, it just says the OpenAI key is missing/wrong.

rony432 avatar Apr 03 '24 18:04 rony432

Hmm, I just tried ollama in Codespaces and it's working for me, I see this in the settings dialog:

(Screenshot: settings dialog showing the Ollama models, 2024-04-03)

It sounds like what you're running into is the requirement for an OPENAI_API_KEY? I mentioned this in the README, but you can just set that to xxx, i.e. OPENAI_API_KEY=xxx python -m openui. Let me know if that doesn't work or you're having other issues with Ollama.
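For anyone following along, the full sequence is roughly the following (a sketch; it assumes the backend package is already installed and Ollama is running on its default port):

# Dummy key so the openai client doesn't complain; Ollama ignores it
export OPENAI_API_KEY=xxx
# Start the OpenUI backend; it should detect a local Ollama automatically
python -m openui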

vanpelt avatar Apr 03 '24 19:04 vanpelt

(screenshots attached)

I cannot access Ollama either.

BeeTwenty avatar Apr 04 '24 09:04 BeeTwenty

I have set it to xxx in the docker run command.
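For comparison, a docker run along these lines should work; the image name, port, and the OLLAMA_HOST override are assumptions (based on the README and on the ollama client honoring OLLAMA_HOST), so adjust to your setup:

# Dummy OpenAI key plus a pointer at the host's Ollama server (assumed image/port)
docker run --rm -p 7878:7878 \
  -e OPENAI_API_KEY=xxx \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  ghcr.io/wandb/openui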

BeeTwenty avatar Apr 04 '24 09:04 BeeTwenty

Latest update is working, thank you. (On Codespaces it's very slow, as usual 👍 :)

Inside the terminal, pull models for Ollama first, then run export OPENAI_API_KEY=xxx, then python -m openui in the main folder.
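Spelled out, that sequence looks roughly like this (the model name is only an example, pull whichever one you want to use):

# Pull a model for Ollama first (example model)
ollama pull codellama
# Dummy key so the OpenAI client is satisfied
export OPENAI_API_KEY=xxx
# Run from the project's main folder
python -m openui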

rony432 avatar Apr 04 '24 14:04 rony432

I did a local installation of Ollama with the codellama model. I tried the Docker setup, but it was not able to connect to the Ollama server to fetch the list of models, so I used a venv, installed the packages locally, and ran the server locally.

I got the models list from Ollama, but the completions API (/v1/chat/completions) failed with a 500 Internal Server Error.

The Ollama package's ollama.chat(**data) call failed with a 404 error while pointing at the /api/chat endpoint. Because of that, I got RuntimeError: Attempted to call a sync iterator on an async stream in the response in the browser.
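A quick way to check whether your local Ollama server even exposes the /api/chat endpoint (a 404 there usually means the Ollama install is too old; the model name below is only an example):

# Should return JSON; a 404 suggests an Ollama version that predates /api/chat
curl http://localhost:11434/api/chat -d '{
  "model": "codellama",
  "messages": [{"role": "user", "content": "hello"}],
  "stream": false
}'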

Has anyone else faced this?

hirenchauhan2 avatar Apr 04 '24 14:04 hirenchauhan2

For Docker, get into the container terminal and run export OPENAI_API_KEY=xxx, kill ollama, then re-run ollama serve and try again.
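Roughly, that sequence looks like the following; the container name is an assumption, and it only applies if Ollama is running inside the same container:

# Open a shell in the running container (assumed container name)
docker exec -it openui /bin/bash
# Inside the container:
export OPENAI_API_KEY=xxx
pkill ollama          # stop the existing Ollama process
ollama serve &        # start it again in the background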

I got a 500 error when it could not find any OpenAI key, or when Ollama took too long to respond. I recommend running on GPU instead of CPU for faster responses.

rony432 avatar Apr 04 '24 15:04 rony432

Note for the devs: increase the reply timeout when Ollama is active.

rony432 avatar Apr 04 '24 15:04 rony432

@hirenchauhan2 and @BeeTwenty those errors look like a potential Ollama compatibility issue. Can you verify you're running a fairly recent version of Ollama, or share what version you have? You can get it by running:

ollama --version

vanpelt avatar Apr 04 '24 21:04 vanpelt

Hey guys, I just updated the README with instructions for running via Docker Compose. That might be easiest, albeit slow.
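Assuming the compose file at the repo root defines the backend and an Ollama service (check the README for the authoritative steps), running it comes down to:

git clone https://github.com/wandb/openui
cd openui
# Build and start the stack in the background
docker compose up -d --build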

vanpelt avatar Apr 05 '24 08:04 vanpelt

@vanpelt Yeah, it was a compatibility issue. I was using an older version of Ollama, 0.1.1. I updated to the newer version, 0.1.30, and it started working. I don't have a GPU on my machine, so it's slow, which is understandable, but I keep getting a timeout error. Is the timeout set in the backend or the frontend? I haven't checked the code yet, hence the question. If we could remove the timeouts just for Ollama, that would be better.

hirenchauhan2 avatar Apr 05 '24 08:04 hirenchauhan2