Chris Van Pelt (CVP)
@vale46n1 @MyndPhreak that sounds like a bigger undertaking. For multi-file / project use cases I suggest: 1. I've been using [Cursor](https://cursor.sh/) and it's awesome. You can open a chat window...
If you set `OPENUI_HOST=https://myserver.com` in your `docker-compose.yaml`, that should fix it! The `ollama/tags` endpoint will error as long as you don't expose an Ollama service to the container, and that's...
BTW I just [added a docker-compose](https://github.com/wandb/openui?tab=readme-ov-file#docker-compose) to the repo that shows how to expose Ollama should you want to.
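Roughly what that looks like, covering both the `OPENUI_HOST` fix above and the Ollama service. Treat this as a sketch: the image tags, service names, and ports here are illustrative, and the compose file linked in the repo is the source of truth.

```yaml
# docker-compose.yaml -- illustrative sketch, not the repo's canonical file
services:
  openui:
    image: ghcr.io/wandb/openui  # assumed image name; see the repo README
    environment:
      # Public URL the backend is served from
      - OPENUI_HOST=https://myserver.com
      # Where to reach Ollama inside the compose network
      - OLLAMA_HOST=http://ollama:11434
    ports:
      - "7878:7878"
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
```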
Sorry about that, I see what's happening. The problem line in the codebase [is here](https://github.com/wandb/openui/blob/main/frontend/src/components/HtmlAnnotator.tsx#L126). You'll need to build the frontend in hosted mode before building the container. You can...
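A sketch of that build order, assuming the frontend is a standard Vite project and that "hosted" is a Vite mode; check the frontend's `package.json` scripts and env files for the exact invocation:

```bash
# Build the frontend in hosted mode first (assumes a Vite mode named "hosted")
cd frontend
npm install
npx vite build --mode hosted

# Then build the container from the repo root so the image picks up the hosted bundle
cd ..
docker build -t wandb/openui .
```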
Merged, tested and adding a fix to master now!
Would like to hear more about your use case. If you want to mess around locally, you'd just change [this line](https://github.com/wandb/openui/blob/main/backend/openui/server.py#L68). That's still going to pass `gpt-3.5-turbo` etc as a model...
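For flavor, here's a hedged sketch of the kind of change that line implies, assuming the backend constructs an OpenAI client; the `base_url` and key below are placeholders for whatever local endpoint you run, not the actual code at that line:

```python
# Illustrative only -- mirrors the pattern around backend/openui/server.py#L68.
# Point the OpenAI client at a local OpenAI-compatible server
# (e.g. Ollama's /v1 endpoint) instead of api.openai.com.
from openai import AsyncOpenAI

client = AsyncOpenAI(
    base_url="http://127.0.0.1:11434/v1",  # assumed local endpoint
    api_key="xxx",  # local servers usually ignore this, but the client requires one
)

# Requests will still pass a model name like "gpt-3.5-turbo" unless you
# also map it to whatever model your local server actually serves.
```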
@yxl23 can you share some more details about what you're imagining? This repo generates HTML, so any Qt interface would just be a wrapper around some kind of web view....
@Verfinix I'm gonna need more information about how you're running the service and what you're seeing. Can you add a screenshot and provide info about how you're running OpenUI?
Hey @amer-spirinova are you running the service on localhost / 127.0.0.1? The middle section needs to be able to talk to http://127.0.0.1:7878 by default. If you're running it on a...
The chat history will show what the model output. Because of the way codellama was trained, you might need to alter the system prompt. The error means it isn't returning...