
no/no_proxy is not honoured

Suika opened this issue 3 years ago · 8 comments

I guess PRs get overlooked without an issue, so here is one. It's about #5596 and the way requests handles no_proxy. Since urllib handles no_proxy properly, it's the logic in requests that messes with the environment settings in a seemingly twisted way.

Expected Result

The ability to use the no_proxy variable both via the OS environment and via function arguments.

Actual Result

Only the OS no_proxy environment variable is processed, and only under certain conditions that have been described multiple times in the existing no_proxy issues.

System Information

$ python -m requests.help
{
  "chardet": {
    "version": "3.0.4"
  },
  "cryptography": {
    "version": "2.9.2"
  },
  "idna": {
    "version": "2.9"
  },
  "implementation": {
    "name": "CPython",
    "version": "3.8.5"
  },
  "platform": {
    "release": "4.15.18-10-pve",
    "system": "Linux"
  },
  "pyOpenSSL": {
    "openssl_version": "1010107f",
    "version": "19.1.0"
  },
  "requests": {
    "version": "2.23.0"
  },
  "system_ssl": {
    "version": "1010107f"
  },
  "urllib3": {
    "version": "1.25.9"
  },
  "using_pyopenssl": true
}

Suika · Jan 23 '21

I have the same issue: no_proxy is ignored in a simple requests.get() call:

import requests

proxies = {
  'http': 'proxy.example.com',
  'no_proxy': 'google.com'
}

requests.get('http://google.com/', proxies=proxies)

With 2.28.0 this yields:

requests.exceptions.ProxyError: HTTPConnectionPool(host='proxy.example.com', port=80): Max retries exceeded with url: http://google.com/ (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<urllib3.connection.HTTPConnection object at 0x10418f5b0>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known')))

Set the following in bash:

export http_proxy="proxy.example.com"
export no_proxy="google.com"

and the requests.get('http://google.com/') works just fine.
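
For reference, the same environment-based workaround can be done from inside Python. A minimal sketch, assuming the default trust_env=True and using proxy.example.com as a placeholder proxy host:

import os
import requests

# Placeholder proxy host, matching the example above.
os.environ["http_proxy"] = "http://proxy.example.com"
os.environ["no_proxy"] = "google.com"

# With trust_env left at its default (True), requests reads the proxy
# settings from the environment, and no_proxy is honoured.
response = requests.get("http://google.com/")
print(response.status_code)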

RichieB2B · Jun 20 '22

Where is it documented that what you believe is a bug should work? At no point has this library supported what you've made up.

sigmavirus24 · Jun 20 '22

From the 2.14.0 release history "Improvements" section:

- It is now possible to pass ``no_proxy`` as a key to the ``proxies`` dictionary to provide handling similar to the ``NO_PROXY`` environment variable.

RichieB2B · Jun 20 '22

So something from ages ago that is likely not documented elsewhere?

sigmavirus24 · Jun 21 '22

It all comes down to feature request https://github.com/psf/requests/issues/2817 and the implementation https://github.com/psf/requests/commit/85400d8d6751071ef78f042d1efa72bdcf76cc0e not actually working. If you think this is not a bug I'd happily create a documentation PR instead.

RichieB2B · Jun 21 '22

I did some more testing and was able to create an exception for a host using the per-host proxy settings:

proxies = {
  'http': 'http://proxy.example.com',
  'http://google.com': '',
}

This is more flexible than the no_proxy mechanism so I can live with it not working as described.
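
Put together, a minimal runnable sketch of this workaround (proxy.example.com is a placeholder proxy host):

import requests

proxies = {
    # Route plain-HTTP traffic through the proxy (placeholder host)...
    'http': 'http://proxy.example.com',
    # ...but connect to this host directly: the empty string overrides the
    # scheme-wide entry for requests to http://google.com.
    'http://google.com': '',
}

response = requests.get('http://google.com/', proxies=proxies)
print(response.status_code)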

I've made a PR to clarify this in the documentation at #6172

RichieB2B · Jun 21 '22

no_proxy doesn't work

python3 -c 'import requests; print(requests.get("http://ipinfo.io/ip", proxies={"no_proxy":"ipinfo.io","http":"http://myproxy:3128"}).text)'

no_proxy works

export no_proxy=ipinfo.io
export http_proxy=http://myproxy:3128

python3 -c 'import requests; print(requests.get("http://ipinfo.io/ip").text)'

The difference is caused by this if statement:

        if "proxies" not in kwargs:
            kwargs["proxies"] = resolve_proxies(request, self.proxies, self.trust_env)

no_proxy is only handled inside resolve_proxies(), which is skipped whenever a proxies argument is passed explicitly:

https://github.com/psf/requests/blob/main/src/requests/sessions.py#L683-L684
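
For anyone tracing this, a minimal sketch of the no_proxy check itself, using the requests.utils helpers that resolve_proxies() builds on (assuming a recent requests version where these helpers live in requests.utils):

from requests.utils import get_environ_proxies, should_bypass_proxies

url = "http://ipinfo.io/ip"

# True: the host matches the no_proxy value, so the proxy would be bypassed.
print(should_bypass_proxies(url, no_proxy="ipinfo.io"))

# get_environ_proxies() applies the same check against the environment
# variables; it returns {} when the proxy should be bypassed.
print(get_environ_proxies(url, no_proxy="ipinfo.io"))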

vimagick · Nov 23 '23