config.json parameter "apiKey" does not appear to be used with the ollama provider
Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the Continue Discord for questions
- [X] I'm not able to find an open issue that reports the same bug
- [X] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Windows 11
- Continue: v0.8.27
- IDE: VS Code
Description
We have Nginx doing SSL termination for Ollama, forwarding to ollama_proxy_server for authentication and then on to Ollama itself. When trying to use Continue in VS Code with the following config.json block:
```json
{
  "title": "llama3-instruct",
  "provider": "ollama",
  "model": "llama3:instruct",
  "completionOptions": {},
  "apiBase": "https://<my_endpoint>",
  "apiKey": "user1:<my_key>"
},
```
I get an HTTP 403 error from the Continue plugin. When analyzing the Nginx logs I see the requests coming in, but without any Bearer token.
Nginx logging config:
```nginx
log_format headers '[$time_local] $remote_addr - $remote_user - $server_name to: $upstream_addr: $request - $http_authorization';
```
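For context, the kind of Nginx front end described above would look roughly like the sketch below; the port, certificate paths, and server name are hypothetical, and only the log_format line is taken from this report. An empty $http_authorization in the access log suggests the header is missing from the client request itself rather than being stripped by Nginx.

```nginx
# Minimal sketch only; port, paths, and names are hypothetical.
log_format headers '[$time_local] $remote_addr - $remote_user - $server_name to: $upstream_addr: $request - $http_authorization';

server {
    listen 443 ssl;
    server_name my_endpoint.example.com;                 # placeholder
    ssl_certificate     /etc/nginx/certs/fullchain.pem;  # placeholder
    ssl_certificate_key /etc/nginx/certs/privkey.pem;    # placeholder

    access_log /var/log/nginx/ollama_access.log headers;

    location / {
        # ollama_proxy_server, which in turn forwards to Ollama
        proxy_pass http://127.0.0.1:8080;
        # Request headers (including Authorization) are passed upstream by
        # default; set explicitly here only to make the expectation visible.
        proxy_set_header Authorization $http_authorization;
        proxy_set_header Host $host;
    }
}
```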
To reproduce
No response
Log output
No response
@troy256 we hadn't been using apiKey for Ollama because it doesn't have built-in auth, but I can see why it would make sense to do that now: https://github.com/continuedev/continue/commit/034754c5f44a25db5a61b6c7ca5a3e8434686f80
The fix will be in the next release, but in the meantime you can use "requestOptions.headers" in config.json like this:
"models": [{
"title": "Ollama",
...
"requestOptions": { "headers": { "Authorization": "Bearer xxx" } }
}]
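Assuming ollama_proxy_server expects the same "user1:<my_key>" value as the bearer token, the config block from this report could be adapted to the workaround roughly like this (placeholders kept from above):

```json
{
  "models": [
    {
      "title": "llama3-instruct",
      "provider": "ollama",
      "model": "llama3:instruct",
      "completionOptions": {},
      "apiBase": "https://<my_endpoint>",
      "requestOptions": {
        "headers": {
          "Authorization": "Bearer user1:<my_key>"
        }
      }
    }
  ]
}
```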