UI not rendering when running in Docker
```
PS C:\Users\phoed> docker run -p 7878:7878 -e OPENAI_API_KEY wandb/openui
wandb: Unpatching OpenAI completions
INFO (openui): Starting OpenUI AI Server created by W&B...
INFO (openui): Running API Server
INFO (uvicorn.error): Started server process [1]
INFO (uvicorn.error): Waiting for application startup.
DEBUG (openui): Starting up server in 1...
INFO (uvicorn.error): Application startup complete.
INFO (uvicorn.error): Uvicorn running on http://127.0.0.1:7878 (Press CTRL+C to quit)
```
The browser returns nothing, just an empty response.
Sorry about that, I'll update the instructions. You likely need to add `-e OPENUI_ENVIRONMENT=production`; without it, the service in the container only listens on 127.0.0.1, but with it you should see `http://0.0.0.0:7878` in the output.
I have the same issue. Can you give more details about "You likely need to add -e OPENUI_ENVIRONMENT=production"?
@lucesgabriel that tells the docker container to run in production mode, so if you're getting the same issue when attempting to run docker, your docker command should instead be:

```shell
docker run -p 7878:7878 -e OPENAI_API_KEY -e OPENUI_ENVIRONMENT=production wandb/openui
```
If you're having an issue when not running via docker, let me know how you're running the application.
Dear @vanpelt, yes, production mode helped to open the application. But now I can't authorize in the dockerized app:
- opening http://127.0.0.1:7878 in the browser redirects to http://127.0.0.1:7878/ai/new
- the application asks me to log in via GitHub
- the login link, https://github.com/login/oauth/authorize?client_id=None&redirect_uri=http%3A%2F%2Flocalhost%3A7878%2Fv1%2Fcallback&response_type=code&scope=user%3Aemail, shows a 404 (note the `client_id=None`)
- re-opening http://127.0.0.1:7878 unfortunately still asks me to log in via GitHub... :(
@phoedos, I just fixed the Ollama / login issue that occurred even when not using OpenAI. For Ollama to work from within Docker you need the Ollama docker container running on the same network, but it should work now. Just pull master and rebuild the container! Details about Ollama in Codespaces here: https://github.com/wandb/openui?tab=readme-ov-file#ollama
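If you want to try the shared-network setup, here is a rough sketch. The network and container names are illustrative, and `OLLAMA_HOST` is how the standard Ollama client library locates the server, which I'm assuming OpenUI honors; check the readme for the authoritative instructions.

```shell
# Illustrative: run Ollama and OpenUI on a shared user-defined Docker network,
# so the OpenUI container can reach Ollama by its container name.
docker network create openui-net

# Start Ollama with a named volume for model storage.
docker run -d --name ollama --network openui-net \
  -v ollama:/root/.ollama -p 11434:11434 ollama/ollama

# Start OpenUI, pointing it at the Ollama container by name.
docker run -p 7878:7878 --network openui-net \
  -e OPENAI_API_KEY \
  -e OLLAMA_HOST=http://ollama:11434 \
  wandb/openui
```

On a user-defined network, Docker's embedded DNS resolves `ollama` to the Ollama container's address, which is why the hostname in `OLLAMA_HOST` matches the container name.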
dear @vanpelt, the image was rebuilt but still no success, though the stacktrace is much better now. PS: OPENAI_API_KEY is defined and API requests from inside the container get a response (per https://platform.openai.com/docs/quickstart/?context=curl). The log stacktrace is: https://pastebin.com/fgD7Eenq
Thanks @phoedos, I found the issue. If you run the following, the docker container should work now! I had forgotten that setting the environment to production tried to enforce login. Let me know if this doesn't work, and thanks for the debugging info!

```shell
git pull origin main
cd backend
docker build . -t wandb/openui --load
docker run -p 7878:7878 -e OPENAI_API_KEY wandb/openui
```
Hi @vanpelt, it's working for me now. Thanks!
dear @vanpelt
still no luck on the updated code:
getting an error like "ERROR (openui): Server Error: All connection attempts failed" during a login attempt
stacktrace: https://pastebin.com/Du00zbmA
@phoedos make sure you're not setting OPENUI_ENVIRONMENT=production; that should be left out when starting the container. I'm not sure what you mean by "login attempt" though.
What behavior are you seeing? The errors in the pastebin are related to not being able to list Ollama models which can be safely ignored.
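If you do want Ollama model listing to work, you can verify reachability from inside the OpenUI container. The container name `openui` and the hostname `ollama` below are assumptions for a shared-network setup; `/api/tags` is Ollama's model-listing endpoint.

```shell
# Check whether the Ollama server is reachable from inside the OpenUI
# container. Assumes the OpenUI container was started with --name openui
# and that an "ollama" container is on the same Docker network.
docker exec openui curl -s http://ollama:11434/api/tags
```

If this prints a JSON list of models instead of a connection error, the "All connection attempts failed" messages should disappear.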
dear @vanpelt
yep, my fault. The issue is completely resolved.
Awesome! I also added a Docker Compose section to the readme that includes Ollama.
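For reference, a minimal compose file along those lines might look like the sketch below. The service names, volume, and `OLLAMA_HOST` wiring are assumptions on my part; the readme's Docker Compose section is the authoritative version.

```shell
# Write an illustrative docker-compose.yml and bring the stack up.
cat > docker-compose.yml <<'EOF'
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
  openui:
    image: wandb/openui
    ports:
      - "7878:7878"
    environment:
      - OPENAI_API_KEY
      - OLLAMA_HOST=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama:
EOF
docker compose up
```

Compose puts both services on a shared default network, so `ollama` resolves as a hostname without any manual `docker network create`.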