Ollama custom URL with basic auth
Is your feature request related to a problem? Please describe.
I would like to use a custom URL for my Ollama instance hosted on my own server: https://ollama.example.com. However, it is protected by basic auth, so every request needs an Authorization header with a basic auth token.
VS Code extensions such as continue.dev already support this kind of configuration: https://github.com/continuedev/continue/issues/834#issuecomment-1953750202
"models": [ {
"title": "TinyLLAMA",
"provider": "ollama",
"model": "tinyllama",
"apiBase": "https://ollama.example.com",
"requestOptions": {
"headers": {"Authorization": "Basic bXl1c2VybmFtZTpteXBhc3N3b3Jk"}
}
}]
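For reference, the Basic token is just the base64 encoding of `username:password`; the value in the snippet above decodes to `myusername:mypassword`. A minimal Node.js sketch for building it (the credentials are placeholders):

```typescript
// Build a Basic auth header value from placeholder credentials (Node.js).
const username = "myusername";
const password = "mypassword";
const basicToken = Buffer.from(`${username}:${password}`).toString("base64");
const authHeader = `Basic ${basicToken}`; // "Basic bXl1c2VybmFtZTpteXBhc3N3b3Jk"
console.log(authHeader);
```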
Describe the solution you'd like
I think adding an option to configure the fetch function would be very helpful.
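For illustration, a rough sketch of what that could look like on the langchainjs side. The `fetch` option on `ChatOllama` is hypothetical here (it is exactly what this request is asking for, not existing API), and the credentials are placeholders:

```typescript
import { ChatOllama } from "@langchain/ollama";

// Wrap fetch so every request to the self-hosted instance carries the Basic auth header.
const fetchWithBasicAuth: typeof fetch = (input, init) => {
  const headers = new Headers(init?.headers);
  headers.set("Authorization", "Basic bXl1c2VybmFtZTpteXBhc3N3b3Jk");
  return fetch(input, { ...init, headers });
};

// Hypothetical wiring: the `fetch` option below is the proposed addition, not existing API.
const model = new ChatOllama({
  baseUrl: "https://ollama.example.com",
  model: "tinyllama",
  fetch: fetchWithBasicAuth,
});

const response = await model.invoke("Hello from behind basic auth!");
console.log(response.content);
```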
Describe alternatives you've considered
None.
Additional context
None.
This one seems related and is a prerequisite: https://github.com/langchain-ai/langchainjs/issues/6631
Seconding this request. https://github.com/langchain-ai/langchainjs/issues/6631 appears to have been merged, which should unblock this issue.