[Bug]: Unable to use custom `base_url` for some `llm_provider`s (`palm`, `gemini`, `vertex_ai`)
What happened?
Using `gemini-pro` through a litellm proxy is currently impossible. Some `llm_provider`s, such as `palm`, `gemini`, and `vertex_ai`, ignore the `base_url` argument.
https://github.com/BerriAI/litellm/blob/ef4c85522c001c930f02e2ec2c32dea9a7816b74/litellm/main.py#L1629
Relevant log output
>>> litellm.completion(base_url=LITELLM_BASE_URL, model="gemini-pro", api_key=os.environ["LITELLM_API_KEY"], messages=[{"role": "user", "content": "Who are you?"}], custom_llm_provider="gemini")
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
Traceback (most recent call last):
File "/home/ironore15/.local/lib/python3.11/site-packages/litellm/llms/gemini.py", line 216, in completion
response = _model.generate_content(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ironore15/.local/lib/python3.11/site-packages/google/generativeai/generative_models.py", line 262, in generate_content
response = self._client.generate_content(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ironore15/.local/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/services/generative_service/client.py", line 791, in generate_content
response = rpc(
^^^^
File "/home/ironore15/.local/lib/python3.11/site-packages/google/api_core/gapic_v1/method.py", line 131, in __call__
return wrapped_func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ironore15/.local/lib/python3.11/site-packages/google/api_core/retry/retry_unary.py", line 293, in retry_wrapped_func
return retry_target(
^^^^^^^^^^^^^
File "/home/ironore15/.local/lib/python3.11/site-packages/google/api_core/retry/retry_unary.py", line 153, in retry_target
_retry_error_helper(
File "/home/ironore15/.local/lib/python3.11/site-packages/google/api_core/retry/retry_base.py", line 212, in _retry_error_helper
raise final_exc from source_exc
File "/home/ironore15/.local/lib/python3.11/site-packages/google/api_core/retry/retry_unary.py", line 144, in retry_target
result = target()
^^^^^^^^
File "/home/ironore15/.local/lib/python3.11/site-packages/google/api_core/timeout.py", line 120, in func_with_timeout
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/ironore15/.local/lib/python3.11/site-packages/google/api_core/grpc_helpers.py", line 78, in error_remapped_callable
raise exceptions.from_grpc_error(exc) from exc
google.api_core.exceptions.InvalidArgument: 400 API key not valid. Please pass a valid API key. [reason: "API_KEY_INVALID"
domain: "googleapis.com"
metadata {
key: "service"
value: "generativelanguage.googleapis.com"
}
]
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ironore15/.local/lib/python3.11/site-packages/litellm/main.py", line 1629, in completion
model_response = gemini.completion(
^^^^^^^^^^^^^^^^^^
File "/home/ironore15/.local/lib/python3.11/site-packages/litellm/llms/gemini.py", line 222, in completion
raise GeminiError(
litellm.llms.gemini.GeminiError: 400 API key not valid. Please pass a valid API key. [reason: "API_KEY_INVALID"
domain: "googleapis.com"
metadata {
key: "service"
value: "generativelanguage.googleapis.com"
}
]
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/ironore15/.local/lib/python3.11/site-packages/litellm/utils.py", line 2947, in wrapper
raise e
File "/home/ironore15/.local/lib/python3.11/site-packages/litellm/utils.py", line 2845, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ironore15/.local/lib/python3.11/site-packages/litellm/main.py", line 2119, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/home/ironore15/.local/lib/python3.11/site-packages/litellm/utils.py", line 8532, in exception_type
raise e
File "/home/ironore15/.local/lib/python3.11/site-packages/litellm/utils.py", line 8500, in exception_type
raise APIConnectionError(
litellm.exceptions.APIConnectionError: 400 API key not valid. Please pass a valid API key. [reason: "API_KEY_INVALID"
domain: "googleapis.com"
metadata {
key: "service"
value: "generativelanguage.googleapis.com"
}
]
Hey @ironore15, the `base_url` param is for Azure OpenAI.
You don't need to set `base_url` for palm/gemini/vertex_ai.
We use the Google libraries, which handle the URL construction for this. Is there something I'm missing here?
Here's how to make the call - https://docs.litellm.ai/docs/providers/vertex#usage-with-litellm-proxy-server
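For reference, a minimal sketch of the standard call without `base_url`, assuming a valid Google AI Studio key; the key value and prompt below are placeholders:

```python
# Minimal sketch: calling Gemini through litellm without setting base_url.
# Assumes a valid Google AI Studio key in GEMINI_API_KEY (placeholder below).
import os
import litellm

os.environ["GEMINI_API_KEY"] = "your-gemini-api-key"  # placeholder

response = litellm.completion(
    model="gemini/gemini-pro",  # the "gemini/" prefix selects the Google AI Studio provider
    messages=[{"role": "user", "content": "Who are you?"}],
)
print(response.choices[0].message.content)
```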
We might want `base_url` support for Vertex AI, since Cloudflare AI Gateway is one use case for it.
https://developers.cloudflare.com/ai-gateway/providers/vertex/
It doesn't work with openrouter either.
Use case: using `base_url` to integrate with the Helicone auto proxy.
See #3732, which tracks `base_url` support for Vertex AI with Cloudflare AI Gateway.
Setting `base_url` is now supported for gemini + vertex_ai (for `vertex_ai_beta/` calls).
https://github.com/BerriAI/litellm/blob/0cd25c250d75ebdb26955338ba2d2a178257e93e/litellm/llms/vertex_httpx.py#L841
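A minimal sketch of what this enables; the gateway URL, key, and model names below are placeholders, not confirmed endpoints:

```python
# Minimal sketch of routing gemini / vertex_ai_beta calls through a custom
# endpoint (e.g. a Cloudflare AI Gateway URL). All URLs, keys, and model
# names here are placeholders.
import litellm

# Gemini (Google AI Studio) via a custom base_url
response = litellm.completion(
    model="gemini/gemini-pro",
    base_url="https://gateway.example.com/google-ai-studio",  # placeholder gateway URL
    api_key="your-gemini-api-key",                            # placeholder
    messages=[{"role": "user", "content": "Who are you?"}],
)

# Vertex AI via the vertex_ai_beta/ route. The usual Vertex credentials
# (e.g. vertex_project / vertex_location or application default credentials)
# may still be required depending on your setup.
response = litellm.completion(
    model="vertex_ai_beta/gemini-1.5-pro",
    base_url="https://gateway.example.com/vertex-ai",  # placeholder gateway URL
    messages=[{"role": "user", "content": "Who are you?"}],
)
print(response.choices[0].message.content)
```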
@ironore15 can we do a 10min call? Would love to learn how you're using litellm, so we can improve
https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat