[Feature]: Support OIDC authentication to upstream APIs
The Feature
https://cloud.google.com/run/docs/authenticating/service-to-service#run-service-to-service-example-python
e.g. using an OIDC identity token from LiteLLM running in a Google Cloud Run instance to authenticate to Amazon Bedrock.
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_providers_create_oidc.html
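Rough sketch of what I mean (the audience string, role ARN, and region below are placeholders, not anything LiteLLM supports today):

```python
# Minimal sketch: exchange a Cloud Run OIDC identity token for temporary AWS
# credentials, then call Bedrock. The audience/role/region values are made up.
import boto3
import requests

AUDIENCE = "https://bedrock.example.com"  # audience configured on the AWS OIDC provider
BEDROCK_ROLE_ARN = "arn:aws:iam::123456789012:role/litellm-bedrock"  # hypothetical role

# 1. Cloud Run's metadata server mints an OIDC identity token for the service account.
token = requests.get(
    "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity",
    params={"audience": AUDIENCE},
    headers={"Metadata-Flavor": "Google"},
).text

# 2. AWS STS exchanges that token for temporary credentials (no stored secret keys).
creds = boto3.client("sts", region_name="us-east-1").assume_role_with_web_identity(
    RoleArn=BEDROCK_ROLE_ARN,
    RoleSessionName="litellm-oidc",
    WebIdentityToken=token,
)["Credentials"]

# 3. Use the temporary credentials for the actual Bedrock calls.
bedrock = boto3.client(
    "bedrock-runtime",
    region_name="us-east-1",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
```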
Motivation, pitch
This would remove the need for private/secret keys to access Amazon Bedrock, Google Vertex AI, and/or Azure OpenAI.
This would work for people running LiteLLM in:
- Google Cloud Run
- GitHub Actions
- and probably many other environments that can issue OIDC identity tokens
Twitter / LinkedIn details
https://www.linkedin.com/in/davidmanouchehri/
Hey @Manouchehri how does this work exactly?
So there's an endpoint we call to get a token? And then use that for making authenticated requests?
I believe this is how the Anthropic Vertex AI integration works today:
https://github.com/BerriAI/litellm/blob/5baeeec899fc5f6ce9869bc68d4a81289494a7e7/litellm/llms/vertex_ai_anthropic.py#L249
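Roughly, that code path loads the user-supplied service account credentials and refreshes them to get a short-lived access token for Vertex AI (a simplified sketch, not the exact code in that file):

```python
# Simplified sketch of the token flow in the Vertex AI Anthropic integration;
# the real implementation lives in litellm/llms/vertex_ai_anthropic.py.
import json

from google.auth.transport.requests import Request
from google.oauth2 import service_account


def get_vertex_access_token(credentials_json: str) -> str:
    # Build credentials from the service account JSON the user supplies.
    creds = service_account.Credentials.from_service_account_info(
        json.loads(credentials_json),
        scopes=["https://www.googleapis.com/auth/cloud-platform"],
    )
    # Refresh to obtain a short-lived OAuth2 access token for Vertex AI requests.
    creds.refresh(Request())
    return creds.token
```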
You can test this by trying to add Vertex AI models on the UI -> just give us service account credentials and we can use those to make requests.
I'm working on this now, will send a draft pull request soon. =)
Completed in #3507.