[Feature]: Complete support for API key as a param to all Ollama endpoints
The Feature
It would be great if the existing support for API keys in ollama_chat completion calls were extended to cover the other calls, specifically those in ollama (not ollama_chat), such as embeddings.
That way authorization can be easily delegated to a reverse proxy (or similar) that validates the bearer token. Right now only completions are covered by the feature.
Link to the original implementation in ollama_chat: https://github.com/BerriAI/litellm/commit/3c6b6355c7ffaad28fe8aab3e39f8e380fd5266b
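For illustration, a minimal sketch of the desired usage (the model name, proxy URL, and token below are placeholders, not anything in litellm itself):

```python
import litellm

# Hypothetical usage: today this api_key is honoured by ollama_chat
# completions but dropped by the plain ollama endpoints such as embeddings.
response = litellm.embedding(
    model="ollama/nomic-embed-text",
    input=["hello world"],
    api_base="https://ollama.example.com",  # authenticating reverse proxy
    api_key="my-proxy-token",  # should be sent as "Authorization: Bearer my-proxy-token"
)
print(response)
```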
Ciao :-)
PS: Off-topic, but the current ollama / ollama_chat duality is something that, viewed from outside, is surprising, and I imagine it leads to all sorts of missing functionality. I've seen that there are some issues about trying to unify both.
Motivation, pitch
To be able to validate ALL the calls to Ollama based on auth + bearer token (api-key).
Twitter / LinkedIn details
No response
@stronk7 we have a ticket on this, would welcome a PR to migrate to ollama_chat consistently - https://github.com/BerriAI/litellm/issues/5048
@krrishdholakia would you be open to making a fix before #5048 is implemented?
I believe that if you change the logic in the calls to ollama_async_streaming around https://github.com/BerriAI/litellm/blob/8fa2cf15eee0a8b8c062960edae2aa3716b29091/litellm/llms/ollama.py#L233 to match https://github.com/BerriAI/litellm/blob/8fa2cf15eee0a8b8c062960edae2aa3716b29091/litellm/llms/ollama_chat.py#L248, by passing the api_key and then building the request (_request in ollama_chat) in the same way, it would address this issue. It feels like the logic in those two sections should be similar.
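Roughly, the change could look like the following. This is a simplified sketch under my own assumptions, not the actual litellm source; the real function takes more parameters:

```python
import httpx

async def ollama_async_streaming(url: str, data: dict, api_key: str | None = None):
    # Build the headers the same way ollama_chat's _request does:
    # attach the key as a bearer token so an authenticating reverse
    # proxy in front of Ollama accepts the call.
    headers = {"Content-Type": "application/json"}
    if api_key is not None:
        headers["Authorization"] = f"Bearer {api_key}"
    async with httpx.AsyncClient() as client:
        async with client.stream("POST", url, json=data, headers=headers) as resp:
            async for line in resp.aiter_lines():
                yield line
```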
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
+1
I think I'm running into this, or at least something very similar. I'm trying to set up aider, which uses litellm to connect to Ollama, but it doesn't work when using OLLAMA_API_TOKEN. Calls to /api/show fail with this error; calls to /api/chat succeed.
The calls to /api/show are failing because my reverse proxy returns 403 (Forbidden) for requests that don't carry an authorization token. Logs from my reverse proxy show that the Authorization header is missing on those requests.
To make it work for now, I'm running an additional reverse proxy alongside aider. It adds the Authorization header to every request before passing it upstream, which allows aider / litellm to make unauthenticated (local) calls.
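For anyone who needs the same stopgap, here is a minimal sketch of that sidecar using aiohttp; the upstream URL, token, and port are placeholders for my setup:

```python
import aiohttp
from aiohttp import web

UPSTREAM = "https://ollama.example.com"  # the authenticating reverse proxy
TOKEN = "my-proxy-token"                 # placeholder bearer token

async def forward(request: web.Request) -> web.Response:
    # Replay the incoming request upstream, injecting the Authorization
    # header that litellm currently omits for /api/show and friends.
    headers = dict(request.headers)
    headers["Authorization"] = f"Bearer {TOKEN}"
    headers.pop("Host", None)  # let aiohttp set the upstream Host header
    async with aiohttp.ClientSession() as session:
        async with session.request(
            request.method,
            UPSTREAM + str(request.rel_url),
            headers=headers,
            data=await request.read(),
        ) as upstream:
            return web.Response(status=upstream.status, body=await upstream.read())

app = web.Application()
app.router.add_route("*", "/{tail:.*}", forward)
web.run_app(app, host="127.0.0.1", port=11434)  # point litellm/aider at this
```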
+1
Please keep this issue alive