Use with basic auth
Validations
- [X] I believe this is a way to improve. I'll try to join the Continue Discord for questions
- [X] I'm not able to find an open issue that requests the same enhancement
Problem
I have a local Ollama instance behind a reverse proxy with SSL and basic auth. I can query it with curl:
$ curl https://user:pass@ollama.example.com
Ollama is running
But I can't use Continue with it. I tried using it as a model:
{
"title": "TinyLLAMA",
"provider": "ollama",
"model": "tinyllama",
"apiBase": "https://user:[email protected]"
},
and got the error
Continue error: https://user:pass@ollama.example.com/api/chat is an url with embedded credentials.
Solution
Implement a way to pass basic auth credentials (or if it is already possible, document it)
This is a nice and easy way to protect an endpoint; please implement this, or support passing an apiKey in an HTTP header.
@radhewro @d1nuc0m We will work on improving the documentation, but to answer your question here: one way of doing it is with requestOptions.headers, like this:
{
"models": [{
"title": "My Model",
...
"requestOptions": {
"headers": {"Authorization": "Bearer xxx"}
}
}]
}
If you do "Authorization": "Basic xxx" and leave your apiBase as just ollama.example.com, this should be equivalent to user:pass@ollama.example.com.
We can parse something like user:pass@ollama.example.com and convert it to the headers format if this is important to you.
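A minimal sketch of what that parsing could look like (illustrative only, not Continue's actual code; Node's WHATWG URL and Buffer do the work):

// Sketch: move user:pass out of the apiBase URL and into a Basic
// Authorization header. Illustrative only, not Continue's implementation.
function splitCredentials(apiBase: string): { apiBase: string; headers: Record<string, string> } {
  const url = new URL(apiBase);
  const headers: Record<string, string> = {};
  if (url.username) {
    const token = Buffer.from(`${url.username}:${url.password}`).toString("base64");
    headers["Authorization"] = `Basic ${token}`;
    url.username = "";
    url.password = "";
  }
  return { apiBase: url.toString(), headers };
}

// splitCredentials("https://user:pass@ollama.example.com")
// => { apiBase: "https://ollama.example.com/", headers: { Authorization: "Basic dXNlcjpwYXNz" } }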
@sestinj Thank you, it works.
As an example for whoever finds this issue: with tinyllama, this is my model setup:
"models": [ {
"title": "TinyLLAMA",
"provider": "ollama",
"model": "tinyllama",
"apiBase": "https://ollama.example.com",
"requestOptions": {
"headers": {"Authorization": "Basic bXl1c2VybmFtZTpteXBhc3N3b3Jk"}
}
}]
}
Where bXl1c2VybmFtZTpteXBhc3N3b3Jk is the output of echo -n "myusername:mypassword" | base64.
Using only ollama.example.com (without the https:// scheme) as apiBase returns Continue error: Invalid URL.
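The same Basic value can also be produced and checked from Node; a small sketch (host and credentials are the placeholders from above, and the global fetch assumes Node 18+):

// Reproduce the shell command's output and verify the endpoint accepts it
const token = Buffer.from("myusername:mypassword").toString("base64");
console.log(token); // bXl1c2VybmFtZTpteXBhc3N3b3Jk

const res = await fetch("https://ollama.example.com", {
  headers: { Authorization: `Basic ${token}` },
});
console.log(res.status, await res.text()); // expect: 200 Ollama is running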
This doesn't seem to be working anymore; my setup works when I try it with Postman, but not with Continue:
Error: HTTP 401 Unauthorized from https://ollama.llm.domains.leaf.cloud/api/chat
<html>
<head><title>401 Authorization Required</title></head>
<body>
<center><h1>401 Authorization Required</h1></center>
<hr><center>nginx</center>
</body>
</html>
My setup for config.json is:
{
  "models": [
    {
      "title": "llama3",
      "provider": "ollama",
      "model": "llama3:instruct",
      "apiBase": "https://llmoll.mycloud.cloud",
      "requestOptions": {
        "headers": {
          "Authorization": "Basic lkjhlkjhlkjh"
        }
      }
    }
  ],
  "customCommands": [
    {
      "name": "test",
      "prompt": "{{{ input }}}\n\nWrite a comprehensive set of unit tests for the selected code. It should setup, run tests that check for correctness including important edge cases, and teardown. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
      "description": "Write unit tests for highlighted code"
    }
  ],
  "tabAutocompleteModel": {
    "title": "llama3",
    "provider": "ollama",
    "model": "llama3:instruct",
    "apiBase": "https://llmoll.mycloud.cloud",
    "requestOptions": {
      "headers": {
        "Authorization": "Basic jhgkhkjhg"
      }
    }
  },
  "allowAnonymousTelemetry": true
}
Looking at https://github.com/continuedev/continue/blob/main/core/llm/llms/Ollama.ts, it seems like the headers from requestOptions will never be picked up.
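If that is the case, the fix could be as simple as spreading the user-supplied headers into the request; a rough sketch (simplified, not the actual Ollama.ts code):

// Sketch: merge user-supplied requestOptions.headers into the chat request
// so an Authorization header from config.json is not dropped.
async function chatRequest(
  apiBase: string,
  body: object,
  requestOptions?: { headers?: Record<string, string> },
): Promise<Response> {
  const headers = {
    "Content-Type": "application/json",
    // User-supplied headers (e.g. Authorization) must survive the merge
    ...(requestOptions?.headers ?? {}),
  };
  return fetch(`${apiBase}/api/chat`, {
    method: "POST",
    headers,
    body: JSON.stringify(body),
  });
}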
Just a little FYI for anyone in a similar situation who is using nginx or nginx-proxy-manager:
Since the Ollama.ts code seems to use the apiKey option but sends it as a Bearer token, I just added a little custom snippet to my nginx-proxy-manager config to handle the authentication, instead of the regular (Basic) auth config, which does not work:
continue config.json
"apiBase": "https://ollama....",
"apiKey": "THETOKEN"
nginx-proxy manager advanced host config:
set $auth_header $http_authorization;
if ($auth_header !~* "^Bearer THETOKEN$") {
    return 403;
}