
[Bug]: Support no_proxy env to use openai compatible api from internal behind corporate firewall

Open VfBfoerst opened this issue 1 year ago • 11 comments

What happened?

A bug happened!

Hello there :)

As discussed in #1458, proxy support already exists. We can confirm that; we noticed it while trying to configure vLLM locally. According to the docs, vLLM is set up like this:

  - model_name: vllm-models
    litellm_params:
      model: openai/facebook/opt-125m # the `openai/` prefix tells litellm it's openai compatible
      api_base: http://123.123.123.123:8000
      rpm: 1440
    model_info: 
      version: 2

We get a response from our corporate proxy despite the no_proxy and NO_PROXY envs being set, so the code seems to ignore them. Support for the no_proxy envs is needed by everyone hosting an OpenAI-compatible API server locally.

Thanks for listening :)

Relevant log output

We got an HTTP response from the corporate proxy, which shouldn't have been contacted at all.

Twitter / LinkedIn details

No response

VfBfoerst avatar Feb 22 '24 10:02 VfBfoerst

@VfBfoerst what is NO_PROXY? can you share more re: requested implementation?

krrishdholakia avatar Feb 22 '24 15:02 krrishdholakia

@VfBfoerst what is NO_PROXY? can you share more re: requested implementation?

@krrishdholakia The no_proxy environment variable on Linux is used to define exceptions (IPs, URLs) for which the proxy should not be used.

E.g. in a typical corporate network setup, the packet flow for an application (in this case LiteLLM) that wants to connect to the internet (in this case OpenAI) is:

LiteLLM -> Corporate Proxy/Firewall -> OpenAI

When a proxy is set as in the mentioned issue, it will be used for every HTTP request. Since the proxy is only for outgoing requests, it typically won't "forward" to internal IPs/URLs, so you need to set exceptions. Let's say we host an OpenAI-compatible API on 172.30.2.2:

Without no_proxy:

LiteLLM -> Corporate Proxy/Firewall -> dead end, can't route to 172.30.2.2

With no_proxy:

export no_proxy=172.30.2.2

LiteLLM -> 172.30.2.2

(I left out a few steps; this is the workflow summed up.)

TL;DR:

If you set a proxy, all HTTP requests will be sent to the proxy, so you need to define exceptions for internal addresses. That is what the no_proxy and NO_PROXY environment variables exist for. The application should accept both no_proxy and NO_PROXY.
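
For illustration, here is a minimal Python sketch of these semantics using only the standard library (the proxy URL is hypothetical); httpx and most other HTTP clients follow the same convention:

import os
import urllib.request

# Hypothetical corporate setup: all traffic goes through the proxy,
# except for the hosts listed in no_proxy.
os.environ["http_proxy"] = "http://proxy.corp.example:3128"
os.environ["no_proxy"] = "172.30.2.2,localhost"

print(urllib.request.getproxies())                    # proxy map read from the environment
print(urllib.request.proxy_bypass("172.30.2.2"))      # truthy -> connect directly
print(urllib.request.proxy_bypass("api.openai.com"))  # falsy  -> go through the proxy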

VfBfoerst avatar Feb 22 '24 19:02 VfBfoerst

  - model_name: vllm-models
    litellm_params:
      model: openai/facebook/opt-125m # the `openai/` prefix tells litellm it's openai compatible
      api_base: http://123.123.123.123:8000
      rpm: 1440
      no_proxy: true
    model_info: 
      version: 2

krrishdholakia avatar Mar 14 '24 15:03 krrishdholakia

Actually, httpx has a cleaner implementation here - https://www.python-httpx.org/environment_variables/#no_proxy

krrishdholakia avatar Mar 14 '24 15:03 krrishdholakia

This is now supported @VfBfoerst

krrishdholakia avatar Mar 15 '24 14:03 krrishdholakia

Just set a comma-separated list as an environment variable and it should get picked up by the proxy.

krrishdholakia avatar Mar 15 '24 14:03 krrishdholakia

@krrishdholakia the environment variable seems not to be passed through correctly.
The no_proxy env is defined as follows:
no_proxy=123.123.123.123,123.123.123.123,123.123.123.123

which leads to a value error:

Traceback (most recent call last):
  File "/usr/local/bin/litellm", line 8, in <module>
    sys.exit(run_server())
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/litellm/proxy/proxy_cli.py", line 437, in run_server
    _, _, general_settings = asyncio.run(
  File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.9/site-packages/litellm/proxy/proxy_server.py", line 2049, in load_config
    router = litellm.Router(**router_params)  # type:ignore
  File "/usr/local/lib/python3.9/site-packages/litellm/router.py", line 195, in __init__
    self.set_model_list(model_list)
  File "/usr/local/lib/python3.9/site-packages/litellm/router.py", line 2089, in set_model_list
    self.set_client(model=model)
  File "/usr/local/lib/python3.9/site-packages/litellm/router.py", line 1948, in set_client
    http_client=httpx.AsyncClient(
  File "/usr/local/lib/python3.9/site-packages/httpx/_client.py", line 1458, in __init__
    {URLPattern(key): transport for key, transport in mounts.items()}
  File "/usr/local/lib/python3.9/site-packages/httpx/_client.py", line 1458, in <dictcomp>
    {URLPattern(key): transport for key, transport in mounts.items()}
  File "/usr/local/lib/python3.9/site-packages/httpx/_utils.py", line 364, in __init__
    raise ValueError(
ValueError: Proxy keys should use proper URL forms rather than plain scheme strings. Instead of "123.123.123.123", use "123.123.123.123://"

VfBfoerst avatar Mar 18 '24 10:03 VfBfoerst

@krrishdholakia Can you reopen the issue please?

VfBfoerst avatar Mar 20 '24 07:03 VfBfoerst

@VfBfoerst the error raised

ValueError: Proxy keys should use proper URL forms rather than plain scheme strings. Instead of "123.123.123.123", use "123.123.123.123://"

seems to imply the no_proxy value set is not correct.

These are set for httpx clients - https://www.python-httpx.org/environment_variables/#no_proxy

Can you let me know what the correct value should be? If you have a working example, I can add the relevant checks on our end.

krrishdholakia avatar Apr 06 '24 16:04 krrishdholakia

@krrishdholakia no_proxy is usually set like:

no_proxy=123.123.123.123,localhost,127.0.0.1,test.de,host.containers.internal

In the httpx documentation, no_proxy entries are set like "http://123.123.123.123", so maybe it will fix the issue if you put http:// in front of the IP when a scheme isn't set already.
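
If it helps, here is a sketch of such a check (the helper name is hypothetical, not LiteLLM code): it rewrites bare no_proxy entries into the scheme-prefixed form that httpx's URLPattern accepts, similar to what httpx does when it parses the variable itself:

import os

def normalized_no_proxy() -> str:
    # Hypothetical helper: prefix bare hosts/IPs in no_proxy with 'all://'
    # so httpx's URLPattern accepts them instead of raising
    # "Proxy keys should use proper URL forms ...".
    raw = os.environ.get("no_proxy") or os.environ.get("NO_PROXY") or ""
    entries = []
    for entry in (e.strip() for e in raw.split(",")):
        if not entry:
            continue
        if "://" not in entry:
            entry = "all://" + entry  # 'all://' matches both http and https
        entries.append(entry)
    return ",".join(entries)

os.environ["no_proxy"] = "123.123.123.123,localhost,test.de"
print(normalized_no_proxy())  # all://123.123.123.123,all://localhost,all://test.de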

VfBfoerst avatar Apr 08 '24 07:04 VfBfoerst

@VfBfoerst happy to add the check, if you can confirm the change works for you

krrishdholakia avatar Apr 08 '24 12:04 krrishdholakia

@VfBfoerst happy to add the check, if you can confirm the change works for you

This change does not solve the issue; it gives the same error.

Suvoo avatar Jun 05 '24 22:06 Suvoo

I have the same issue

~ $ export NO_PROXY="127.0.0.1/8,169.254.169.254/32"
~ $ export HTTPS_PROXY="http://127.0.0.1:8080"
~ $ export HTTP_PROXY="http://127.0.0.1:8080"
~ $
~ $
~ $ litellm --port 4001 --host 127.0.0.1 --telemetry False --config /litellm/config.yml
Traceback (most recent call last):
  File "/usr/local/bin/litellm", line 5, in <module>
    from litellm import run_server
  File "/usr/local/lib/python3.11/site-packages/litellm/__init__.py", line 240, in <module>
    module_level_aclient = AsyncHTTPHandler(timeout=request_timeout)
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/custom_httpx/http_handler.py", line 16, in __init__
    self.client = self.create_client(
                  ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/custom_httpx/http_handler.py", line 52, in create_client
    return httpx.AsyncClient(
           ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/httpx/_client.py", line 1458, in __init__
    {URLPattern(key): transport for key, transport in mounts.items()}
  File "/usr/local/lib/python3.11/site-packages/httpx/_client.py", line 1458, in <dictcomp>
    {URLPattern(key): transport for key, transport in mounts.items()}
     ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/httpx/_utils.py", line 364, in __init__
    raise ValueError(
ValueError: Proxy keys should use proper URL forms rather than plain scheme strings. Instead of "127.0.0.1/8", use "127.0.0.1/8://"

If I set just HTTPS_PROXY and NO_PROXY, or just HTTP_PROXY and NO_PROXY, there is no error.

Even when I run httpx directly, everything works perfectly:

~ $ export NO_PROXY="127.0.0.1/8,169.254.169.254/32,www.python-httpx.org"
~ $ export HTTPS_PROXY="http://127.0.0.1:8080"
~ $ export HTTP_PROXY="http://127.0.0.1:8080"
~ $ python -c "import httpx; httpx.get('https://www.python-httpx.org')"
~ $
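
Here is a short script showing the difference (it pokes at httpx's private _utils module, so it's illustrative only): httpx's own environment parser normalizes bare NO_PROXY entries into valid patterns, while feeding the raw entries straight into URLPattern, as in the tracebacks above, raises the ValueError.

import os

os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8080"
os.environ["NO_PROXY"] = "127.0.0.1,169.254.169.254"

# Private httpx API, used here only to illustrate the behavior
from httpx._utils import URLPattern, get_environment_proxies

# httpx normalizes bare entries, e.g. '127.0.0.1' becomes the key 'all://127.0.0.1'
print(get_environment_proxies())

# Passing a raw NO_PROXY entry straight to URLPattern reproduces the crash
URLPattern("127.0.0.1")  # ValueError: Proxy keys should use proper URL forms ...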

sabretus avatar Jun 19 '24 10:06 sabretus

According to the httpx documentation, NO_PROXY should be set like export NO_PROXY="all://127.0.0.1,all://169.254.169.254", but it looks like it's ignored by boto3, as I can see requests still going to 169.254.169.254 in my proxy logs:

127.0.0.1 169.254.169.254 [2024-06-19T11:56:09+00:00] 0.001 "GET http://169.254.169.254/latest/meta-data/iam/security-credentials/myInstanceRole HTTP/1.1" 200 1570 - "-" "Boto3/1.34.127 Python/3.11.9 Linux/6.5.0-1018-aws Botocore/1.34.127" -

This comment may be the explanation, but I don't understand how to fix it.

sabretus avatar Jun 19 '24 12:06 sabretus

Hey @sabretus, how are you calling litellm? Steps to repro would be helpful.

Here's the relevant boto3 code I think might be affecting you - https://github.com/BerriAI/litellm/blob/2834b5e7ee3bf1cc5ac1ff94dd8a4714ca5fce60/litellm/llms/bedrock_httpx.py#L291

krrishdholakia avatar Jun 19 '24 17:06 krrishdholakia

I think with boto3 you can set the proxy information in your AWS config file and it should pick that up.

https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#using-environment-variables
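
E.g. a rough sketch with botocore's Config (the proxy URL is hypothetical, and I'm not sure this also covers the instance-metadata credential call):

import boto3
from botocore.config import Config

# Config(proxies=...) sets the proxy per client, taking precedence over
# the HTTP(S)_PROXY environment variables for that client.
cfg = Config(proxies={"https": "http://proxy.corp.example:3128"})
client = boto3.client("bedrock-runtime", region_name="us-east-1", config=cfg)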

krrishdholakia avatar Jun 19 '24 17:06 krrishdholakia

Thanks for the response @krrishdholakia

These are actually two separate issues.

First, I can't specify NO_PROXY with a regular CIDR-formatted IP address - I have posted steps to reproduce above:

  1. In Alpine container set env variables
  2. Run litellm
  3. See error

Second, my AWS instance has an IAM role assigned that grants access to bedrock:*. When I call a model through LiteLLM, it gets temporary credentials using this role (exactly the method you mentioned in your comment), like this:

GET http://169.254.169.254/latest/meta-data/iam/security-credentials/myInstanceRole

I have seen the documentation, and the problem is that I don't see how to set no_proxy there. I want LiteLLM to ignore the proxy for the IP 169.254.169.254 so it can be accessed directly; this is in fact a reserved local metadata IP, similar to 127.0.0.1. It looks like boto3 removed this option completely and I can't find a workaround: my LiteLLM is calling 169.254.169.254 over the proxy, and that is not right.

sabretus avatar Jun 20 '24 12:06 sabretus

Got it - is there a way to set 'no_proxy' on boto3? @sabretus

happy to expose that as a param and pass that through

krrishdholakia avatar Jun 20 '24 14:06 krrishdholakia

@krrishdholakia I upgraded the package and am still facing this error; could you please check?

ValueError: Proxy keys should use proper URL forms rather than plain scheme strings. Instead of "10.119.112.43", use "10.119.112.43://"

sarathsurpur avatar Jul 01 '24 06:07 sarathsurpur

Hey @sarathsurpur, I put out a dev release with the fix. Can you confirm it works for you - https://github.com/BerriAI/litellm/releases/tag/v1.41.3.dev3

krrishdholakia avatar Jul 03 '24 01:07 krrishdholakia

I am running LiteLLM 1.41.11 behind a corporate proxy as well, with no_proxy=10.1.10.1,[etc.] and facing the same problem mentioned above regarding the http scheme. Adding http://10.1.10.1 makes the error go away, but litellm ends up hitting the corporate proxy again. From the stack trace, I see that there is some manual handling of proxy settings within LiteLLM before they are passed down to httpx.

Happy to help by sharing a repro or testing.

Note: I tried a workaround to disable the proxy by setting it to empty for the specific command (http_proxy= [COMMAND]), but this resulted in an error where LiteLLM tried to parse the empty string.

What finally worked was unset http_proxy; unset HTTP_PROXY; [etc.] [COMMAND]
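
For reference, a small Python equivalent of that unset chain (a hypothetical wrapper, not LiteLLM functionality):

import os
import subprocess

# Launch litellm with every proxy-related variable stripped, in both
# lower- and upper-case, instead of unsetting them in the parent shell.
env = {k: v for k, v in os.environ.items()
       if k.lower() not in {"http_proxy", "https_proxy", "all_proxy", "no_proxy"}}
subprocess.run(["litellm", "--config", "/litellm/config.yml"], env=env)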

mbbyn avatar Jul 08 '24 15:07 mbbyn

Hey @mbbyn this should be fixed once v1.41.12 is live in prod

If you want to test it now, can you try it with the dev release - v1.41.11.dev5 and let me know if it's resolved

krrishdholakia avatar Jul 08 '24 15:07 krrishdholakia

Just tested v1.41.13, and it works great 👍 Kudos

mbbyn avatar Jul 09 '24 07:07 mbbyn

Great - @sarathsurpur @sabretus @VfBfoerst let me know if y'all face any issues from v1.41.13 onwards

krrishdholakia avatar Jul 09 '24 15:07 krrishdholakia