[Bug]: Unable to use Azure OPENAI

Open benahmedadel opened this issue 1 year ago • 3 comments

What happened?

I am unable to connect to Azure OpenAI using LiteLLM (see the log below). This is my LiteLLM configuration; I configured three models in three different ways to test them, and I get the same error every time. I tested the key, API version, and endpoint with plain Python code and they work.

LiteLLM: Current Version = 1.34.34

  - litellm_params: 
      model: "gpt-4"
      api_key: "XXXXXXXXXXXXXXXXXXXXXXX"
      api_base: "https://<my_endpoint>.openai.azure.com/"
      api_version: "2024-02-01"
      temperature: 0.3
      max_tokens: 1024
    model_name: "gpt-4"
  - litellm_params: 
      model: "azure/gpt-3.5-turbo-instruct-0914"
      api_key: "XXXXXXXXXXXXXXXXXXXXXXX"
      api_base: "https://<my_endpoint>.openai.azure.com/"
      api_version: "2024-02-01"
      temperature: 0.3
      max_tokens: 1024
    model_name: "gpt-3.5-turbo-instruct-0914"
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: azure/gpt-turbo-small-eu
      api_base: https://<my_endpoint>.openai.azure.com/
      api_key: "os.environ/AZURE_API_KEY"
      rpm: 6  
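
For reference, the direct test of the key, API version, and endpoint was along these lines (a minimal sketch only; the endpoint, key, and deployment name here are placeholders):

# Minimal sanity check against Azure OpenAI, bypassing LiteLLM entirely.
# The endpoint, key variable, and deployment name below are placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<my_endpoint>.openai.azure.com/",
    api_key=os.environ["AZURE_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-4",  # the Azure *deployment* name, not the OpenAI model id
    messages=[{"role": "user", "content": "ping"}],
    max_tokens=16,
)
print(response.choices[0].message.content)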

Relevant log output

Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: Traceback (most recent call last):
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     yield
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpx/_transports/default.py", line 373, in handle_async_request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     resp = await self._pool.handle_async_request(req)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 216, in handle_async_request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise exc from None
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 196, in handle_async_request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await connection.handle_async_request(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpcore/_async/connection.py", line 99, in handle_async_request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise exc
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpcore/_async/connection.py", line 76, in handle_async_request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     stream = await self._connect(request)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpcore/_async/connection.py", line 154, in _connect
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     stream = await stream.start_tls(**kwargs)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpcore/_backends/anyio.py", line 66, in start_tls
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     with map_exceptions(exc_map):
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     self.gen.throw(typ, value, traceback)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise to_exc(exc) from exc
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: httpcore.ConnectError
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: The above exception was the direct cause of the following exception:
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: Traceback (most recent call last):
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/openai/_base_client.py", line 1475, in _request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await self._client.send(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpx/_client.py", line 1661, in send
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await self._send_handling_auth(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpx/_client.py", line 1689, in _send_handling_auth
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await self._send_handling_redirects(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpx/_client.py", line 1726, in _send_handling_redirects
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await self._send_single_request(request)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpx/_client.py", line 1763, in _send_single_request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await transport.handle_async_request(request)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/llms/custom_httpx/azure_dall_e_2.py", line 66, in handle_async_request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     return await super().handle_async_request(request)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpx/_transports/default.py", line 372, in handle_async_request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     with map_httpcore_exceptions():
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     self.gen.throw(typ, value, traceback)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise mapped_exc(message) from exc
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: httpx.ConnectError
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: The above exception was the direct cause of the following exception:
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: Traceback (most recent call last):
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/llms/openai.py", line 561, in async_streaming
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await openai_aclient.chat.completions.create(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 1334, in create
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     return await self._post(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/openai/_base_client.py", line 1743, in post
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/openai/_base_client.py", line 1446, in request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     return await self._request(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/openai/_base_client.py", line 1499, in _request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     return await self._retry_request(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/openai/_base_client.py", line 1568, in _retry_request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     return await self._request(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/openai/_base_client.py", line 1499, in _request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     return await self._retry_request(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/openai/_base_client.py", line 1568, in _retry_request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     return await self._request(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/openai/_base_client.py", line 1509, in _request
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise APIConnectionError(request=request) from err
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: openai.APIConnectionError: Connection error.
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: During handling of the above exception, another exception occurred:
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: Traceback (most recent call last):
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/main.py", line 317, in acompletion
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await init_response
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/llms/openai.py", line 585, in async_streaming
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise OpenAIError(status_code=500, message=f"{str(e)}")
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: litellm.llms.openai.OpenAIError: Connection error.
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: During handling of the above exception, another exception occurred:
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: Traceback (most recent call last):
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 1239, in async_function_with_retries
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await original_function(*args, **kwargs)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 485, in _acompletion
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise e
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 461, in _acompletion
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await litellm.acompletion(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/utils.py", line 3411, in wrapper_async
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise e
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/utils.py", line 3243, in wrapper_async
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     result = await original_function(*args, **kwargs)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/main.py", line 330, in acompletion
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise exception_type(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/utils.py", line 8526, in exception_type
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise e
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/utils.py", line 7390, in exception_type
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise APIError(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: litellm.exceptions.APIError: OpenAIException - Connection error.
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: During handling of the above exception, another exception occurred:
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: Traceback (most recent call last):
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/proxy/proxy_server.py", line 3439, in chat_completion
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     responses = await asyncio.gather(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 402, in acompletion
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise e
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 398, in acompletion
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await self.async_function_with_fallbacks(**kwargs)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 1222, in async_function_with_fallbacks
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise original_exception
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 1143, in async_function_with_fallbacks
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await self.async_function_with_retries(*args, **kwargs)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 1331, in async_function_with_retries
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise e
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 1293, in async_function_with_retries
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     response = await original_function(*args, **kwargs)
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 485, in _acompletion
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise e
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 410, in _acompletion
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     deployment = self.get_available_deployment(
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:   File "/appli/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 2443, in get_available_deployment
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]:     raise ValueError(f"No healthy deployment available, passed model={model}")
Apr 08 08:06:58 LITELLM_SERVER start_litellm.sh[2819225]: ValueError: No healthy deployment available, passed model=gpt-4


benahmedadel avatar Apr 08 '24 06:04 benahmedadel

Hi @benahmedadel, is this on a locally running or a deployed proxy server?

ishaan-jaff avatar Apr 09 '24 23:04 ishaan-jaff

It is running on an Ubuntu server where I have LiteLLM installed. I wanted to test Azure OpenAI through LiteLLM, but it does not work. It is not very important for me because it is just a test, but I wanted you to be aware of it.

benahmedadel avatar Apr 09 '24 23:04 benahmedadel

@benahmedadel, are you free to debug this together? Happy to help.

I'm on here if you're free: https://meet.google.com/ezd-xomn-fgc

ishaan-jaff avatar Apr 10 '24 00:04 ishaan-jaff

Hi @benahmedadel, I wanted to follow up on this. It's a P0 for us to solve, and it would really help to get on a call to better understand your environment. How's 9 am PT on April 10th?

Sharing a link to my personal calendar for your convenience: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

ishaan-jaff avatar Apr 10 '24 04:04 ishaan-jaff

Hello

I really appreciate your suggestion. I am sorry, today is a Muslim holiday and I have been away from technology for the last 8 hours. Can we schedule a call for Saturday, please? I can show you the logs and my LiteLLM config.

Thanks


benahmedadel avatar Apr 10 '24 11:04 benahmedadel

Hello, this issue can be closed; it was something very silly. I just forgot to add the proxy environment variables to the service that starts LiteLLM. Everything is OK now.

benahmedadel avatar Apr 10 '24 22:04 benahmedadel

@benahmedadel, which environment variables were missing or incorrect? This will help other users.

ishaan-jaff avatar Apr 10 '24 22:04 ishaan-jaff

Can we schedule a call for Saturday, please?

Can we set up a call for Saturday? I'd love to learn how you use the proxy. What email can I send an invite to?

ishaan-jaff avatar Apr 10 '24 22:04 ishaan-jaff

@benahmedadel, which environment variables were missing or incorrect? This will help other users.

I added 4 variables to my service launcher: http_proxy, https_proxy, HTTP_PROXY, and HTTPS_PROXY. This can also be done with export var=value in the shell script. Here is my systemd unit file:

[Unit]
Description=Litellm
After=network-online.target ollama

[Service]
User=litellm
Group=litellm
Type=simple
ExecStart=/appli/litellm/start_litellm.sh
Environment="TMPDIR=/appli/tmp"
Environment="HTTPS_PROXY=http://127.0.0.1:3128"
Environment="HTTP_PROXY=http://127.0.0.1:3128"
Environment="https_proxy=http://127.0.0.1:3128"
Environment="http_proxy=http://127.0.0.1:3128"
Environment="no_proxy=127.0.0.1,0.0.0.0,X.X.X.X"

[Install]
WantedBy=multi-user.target
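
To confirm the variables are actually visible to the LiteLLM process and picked up by its HTTP client, a quick check like the following can help (a sketch only; the endpoint is a placeholder, and it assumes httpx's default trust_env behaviour of reading proxy variables from the environment):

# Quick check: are the proxy variables set for this process, and can httpx
# (the client used under the hood by the openai SDK and LiteLLM) reach the
# Azure endpoint through them?
import os
import httpx

for var in ("HTTP_PROXY", "HTTPS_PROXY", "http_proxy", "https_proxy", "no_proxy"):
    print(var, "=", os.environ.get(var))

# httpx honours the proxy environment variables when trust_env=True (the default).
client = httpx.Client(trust_env=True)
resp = client.get("https://<my_endpoint>.openai.azure.com/", timeout=10)
# Any HTTP status code here means outbound connectivity through the proxy works.
print("reached endpoint, status:", resp.status_code)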

benahmedadel avatar Apr 10 '24 22:04 benahmedadel

  • Can you help me understand why you needed to add http_proxy, https_proxy, HTTP_PROXY, and HTTPS_PROXY to your env?
  • Was it fixed after adding these vars?

Some other users are facing this, and I'm trying to understand it better.

ishaan-jaff avatar Apr 10 '24 22:04 ishaan-jaff

  • Can you help me understand why you needed to add http_proxy, https_proxy, HTTP_PROXY, and HTTPS_PROXY to your env?
  • Was it fixed after adding these vars?

Some other users are facing this, and I'm trying to understand it better.

I have a CNTLM proxy installed on my test server. Without it, I cannot access the internet, so on Ubuntu I have to set these 4 vars for operations like wget, apt-get, etc. Yes, as I told you, the issue is fixed. LiteLLM is working great; the problem was with my environment variables.

benahmedadel avatar Apr 10 '24 22:04 benahmedadel