No advice on connecting to Ollama behind Open WebUI
Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the Continue Discord for questions
- [X] I'm not able to find an open issue that reports the same bug
- [X] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: N/A
- Continue version: v0.8.52
- IDE version: 1.93.1
- Model: N/A
- config.json: N/A
Description
I see there is a closed issue / bug explaining how to connect to an internally hosted Ollama listening on an IP on port 11434. But how do you connect to an Ollama instance that is only served behind the popular Open WebUI (443/tcp TLS)?
To reproduce
No response
Log output
No response
Hi @huornlmj, have you tried setting the apiBase for the model?
If you get this running it could be a nice Tutorial to add to the docs site if you're interested in contributing!
Hi @huornlmj, here's a working config file for Open WebUI.
{
  "models": [
    {
      "title": "Ollama Remote",
      "provider": "ollama",
      "model": "llama3:instruct",
      "completionOptions": {},
      "contextLength": 8192,
      "apiBase": "https://<your Open WebUI URL>/ollama",
      "apiKey": "<your API key>"
    }
  ],
  "customCommands": [
    {
      "name": "test",
      "prompt": "{{{ input }}}\n\nWrite a comprehensive set of unit tests for the selected code. It should setup, run tests that check for correctness including important edge cases, and teardown. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
      "description": "Write unit tests for highlighted code"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Ollama Remote Autocomplete",
    "provider": "ollama",
    "model": "codellama:code",
    "completionOptions": {},
    "contextLength": 8192,
    "apiBase": "https://<your Open WebUI URL>/ollama",
    "apiKey": "<your API key>"
  },
  "allowAnonymousTelemetry": false,
  "embeddingsProvider": {
    "provider": "transformers.js"
  },
  "docs": []
}
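If you want to sanity-check the endpoint outside of Continue first, you can hit the proxied Ollama API directly. This sketch assumes Open WebUI's default /ollama proxy path and a Bearer-token API key from your Open WebUI account settings; adjust if your deployment differs:

# List the models reachable through the Open WebUI proxy
curl -H "Authorization: Bearer <your API key>" \
  https://<your Open WebUI URL>/ollama/api/tags

If that returns your model list, the same apiBase and apiKey values should work in the config above.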
Thanks for sharing @0xThresh !
Same problem, but the config doesn't work for me. The tabAutocompleteModel and some functions like writing a docstring work well, but anything that goes through the chat panel always gives me an empty response. Here is my config.
"models": [
{
"model": "qwen2.5-coder:7b",
"provider": "ollama",
"completionOptions": {},
"contextLength": 8192,
"apiBase": "https://****/ollama",
"apiKey": "*****",
"title": "Qwen2.5 Coder 7B"
}
],
"tabAutocompleteModel": {
"model": "qwen2.5-coder:7b",
"provider": "ollama",
"completionOptions": {},
"contextLength": 8192,
"apiBase": "https://****/ollama",
"apiKey": "****",
"title": "Qwen2.5 Coder 7B"
}
For example, this happens whenever I ask it to explain my code.
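One way to narrow down where the empty response comes from is to call the chat endpoint through the proxy directly, bypassing Continue entirely. This is a sketch using Ollama's standard /api/chat payload; the URL, API key, and model name are placeholders:

# Request a single (non-streamed) chat completion through the Open WebUI proxy
curl -X POST https://<your Open WebUI URL>/ollama/api/chat \
  -H "Authorization: Bearer <your API key>" \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen2.5-coder:7b", "messages": [{"role": "user", "content": "Say hello"}], "stream": false}'

If this returns a normal message, the proxy itself is fine and the problem sits between Continue and Open WebUI.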
@xqe2011 this actually just started happening to me as well, and I noticed it right after an Open WebUI upgrade. I went from around 0.3.10 to 0.3.32, and the issue appeared right away. I'll bring it up with the Open WebUI folks; it's helpful to know it's not just me.
Can you tell me what version of Open WebUI you're on?
UPDATE: The maintainer just told me someone submitted a PR that should fix this in 0.3.35, so I'll be upgrading to that tomorrow and will report back.
@0xThresh
My Open WebUI version: v0.3.33
Ollama version: 0.3.14
continue.dev version: v0.8.55
UPDATE: I upgraded Open WebUI to v0.3.35, and the problem was solved.
Could you share your config that works now? Thank you.
This issue hasn't been updated in 90 days and will be closed after an additional 10 days without activity. If it's still important, please leave a comment and share any new information that would help us address the issue.
This issue was closed because it wasn't updated for 10 days after being marked stale. If it's still important, please reopen + comment and we'll gladly take another look!