
test_tls_client_auth[VerifyMode.CERT_{REQUIRED,OPTIONAL}-False-localhost-builtin] fails on macOS

Open • jaraco opened this issue 5 years ago • 5 comments

As originally reported in #225, two tests fail reliably for me on macOS.
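
For context, here is a minimal sketch, not the cheroot test itself, of the scenario these two parametrizations exercise as I understand it: a server-side SSL context that verifies client certificates (`ssl.CERT_OPTIONAL` / `ssl.CERT_REQUIRED`) while the client offers a certificate issued by a CA the server does not trust. It uses trustme, which is already in the dependency list below; the hostnames, identities, and port handling are illustrative assumptions only.

```python
# Hypothetical reproduction sketch, NOT the cheroot test suite.
import socket
import ssl
import threading

import trustme

server_ca, rogue_ca = trustme.CA(), trustme.CA()
server_cert = server_ca.issue_cert(u'localhost')
client_cert = rogue_ca.issue_cert(u'client.localhost')  # untrusted issuer

srv_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
server_cert.configure_cert(srv_ctx)
server_ca.configure_trust(srv_ctx)        # server only trusts its own CA
srv_ctx.verify_mode = ssl.CERT_REQUIRED   # or ssl.CERT_OPTIONAL

cli_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
server_ca.configure_trust(cli_ctx)
client_cert.configure_cert(cli_ctx)       # cert from the untrusted CA

listener = socket.create_server(('127.0.0.1', 0))
port = listener.getsockname()[1]


def serve_once():
    conn, _ = listener.accept()
    try:
        # Handshake fails because the client cert does not verify.
        srv_ctx.wrap_socket(conn, server_side=True)
    except OSError:
        pass
    finally:
        conn.close()


server_thread = threading.Thread(target=serve_once)
server_thread.start()

try:
    with socket.create_connection(('127.0.0.1', port)) as raw:
        with cli_ctx.wrap_socket(raw, server_hostname='localhost') as tls:
            tls.sendall(b'GET / HTTP/1.1\r\nHost: localhost\r\n\r\n')
            tls.recv(1024)
except (ssl.SSLError, OSError) as exc:
    # Depending on TLS version and timing, the client sees a handshake
    # failure, ECONNRESET, or a broken pipe.
    print('client-side error:', exc)

server_thread.join()
listener.close()
```

On macOS the client side of this kind of exchange can surface the dropped connection as a plain EPIPE rather than a TLS-level error, which appears consistent with the `SysCallError: (32, 'EPIPE')` in the failure output below.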

Here is the test output from my local workstation:

python develop-inst-noop: /Users/jaraco/code/public/cherrypy/cheroot
python installed: apipkg==1.5,argh==0.26.2,asn1crypto==1.0.1,atomicwrites==1.3.0,attrs==19.2.0,certifi==2019.9.11,cffi==1.12.3,chardet==3.0.4,-e git+gh://cherrypy/cheroot@4fa4f701d828ff296cf9801f9d0fadf64c303a5e#egg=cheroot,codecov==2.0.15,colorama==0.4.1,coverage==4.5.3,cryptography==2.7,docopt==0.6.2,execnet==1.7.1,idna==2.8,jaraco.functools==2.0,more-itertools==7.2.0,packaging==19.2,pathtools==0.1.2,pluggy==0.13.0,py==1.8.0,pycparser==2.19,pyOpenSSL==19.0.0,pyparsing==2.4.2,pytest==5.2.1,pytest-cov==2.7.1,pytest-forked==1.1.1,pytest-mock==1.10.4,pytest-sugar==0.9.2,pytest-testmon==0.9.19,pytest-watch==4.2.0,pytest-xdist==1.30.0,PyYAML==5.1.2,requests==2.22.0,requests-unixsocket==0.2.0,six==1.12.0,termcolor==1.1.0,trustme==0.5.2,urllib3==1.25.6,watchdog==0.9.0,wcwidth==0.1.7
python run-test-pre: PYTHONHASHSEED='1930311714'
python run-test: commands[0] | pytest --testmon-off
============================= test session starts ==============================
platform darwin -- Python 3.8.0rc1, pytest-5.2.1, py-1.8.0, pluggy-0.13.0 -- /Users/jaraco/code/public/cherrypy/cheroot/.tox/python/bin/python
cachedir: .tox/python/.pytest_cache
rootdir: /Users/jaraco/code/public/cherrypy/cheroot, inifile: pytest.ini, testpaths: cheroot/test/
plugins: testmon-0.9.19, xdist-1.30.0, forked-1.1.1, sugar-0.9.2, cov-2.7.1, mock-1.10.4
gw0 I / gw1 I / gw2 I / gw3 I

[gw0] darwin Python 3.8.0 cwd: /Users/jaraco/code/public/cherrypy/cheroot

[gw1] darwin Python 3.8.0 cwd: /Users/jaraco/code/public/cherrypy/cheroot

[gw2] darwin Python 3.8.0 cwd: /Users/jaraco/code/public/cherrypy/cheroot

[gw3] darwin Python 3.8.0 cwd: /Users/jaraco/code/public/cherrypy/cheroot

[gw0] Python 3.8.0rc1 (v3.8.0rc1:34214de6ab, Oct  1 2019, 12:56:49)  -- [Clang 6.0 (clang-600.0.57)]

[gw1] Python 3.8.0rc1 (v3.8.0rc1:34214de6ab, Oct  1 2019, 12:56:49)  -- [Clang 6.0 (clang-600.0.57)]

[gw2] Python 3.8.0rc1 (v3.8.0rc1:34214de6ab, Oct  1 2019, 12:56:49)  -- [Clang 6.0 (clang-600.0.57)]

[gw3] Python 3.8.0rc1 (v3.8.0rc1:34214de6ab, Oct  1 2019, 12:56:49)  -- [Clang 6.0 (clang-600.0.57)]
gw0 [112] / gw1 [112] / gw2 [112] / gw3 [112]

scheduling tests via LoadScheduling

cheroot/test/test__compat.py::test_compat_functions_positive[ntob-bar-bar] 
cheroot/test/test__compat.py::test_compat_functions_positive[ntou-bar-bar] 
cheroot/test/test__compat.py::test_compat_functions_positive[bton-bar-bar] 
cheroot/test/test__compat.py::test_compat_functions_negative_nonnative[ntob] 
[gw0] [  0%] PASSED cheroot/test/test__compat.py::test_compat_functions_positive[ntob-bar-bar] 
[gw1] [  1%] PASSED cheroot/test/test__compat.py::test_compat_functions_positive[ntou-bar-bar] 
cheroot/test/test__compat.py::test_compat_functions_negative_nonnative[ntou] 
[gw2] [  2%] PASSED cheroot/test/test__compat.py::test_compat_functions_positive[bton-bar-bar] 
[gw0] [  3%] PASSED cheroot/test/test__compat.py::test_compat_functions_negative_nonnative[ntou] 
cheroot/test/test__compat.py::test_ntou_escape 
cheroot/test/test__compat.py::test_extract_bytes[qwerty-qwerty] 
[gw1] [  4%] PASSED cheroot/test/test__compat.py::test_ntou_escape 
cheroot/test/test__compat.py::test_extract_bytes_invalid 
[gw0] [  5%] PASSED cheroot/test/test__compat.py::test_extract_bytes_invalid 
cheroot/test/test_conn.py::test_HTTP11_persistent_connections 
[gw2] [  6%] PASSED cheroot/test/test__compat.py::test_extract_bytes[qwerty-qwerty] 
cheroot/test/test_conn.py::test_streaming_10[False] 
cheroot/test/test_conn.py::test_streaming_11[False] 
[gw3] [  7%] PASSED cheroot/test/test__compat.py::test_compat_functions_negative_nonnative[ntob] 
cheroot/test/test__compat.py::test_extract_bytes[input_argument1-asdfgh] 
[gw3] [  8%] PASSED cheroot/test/test__compat.py::test_extract_bytes[input_argument1-asdfgh] 
cheroot/test/test_conn.py::test_streaming_11[True] 
[gw0] [  8%] PASSED cheroot/test/test_conn.py::test_streaming_10[False] 
cheroot/test/test_conn.py::test_keepalive_conn_management 
[gw3] [  9%] PASSED cheroot/test/test_conn.py::test_streaming_11[True] 
cheroot/test/test_conn.py::test_keepalive[HTTP/1.1] 
[gw2] [ 10%] PASSED cheroot/test/test_conn.py::test_streaming_11[False] 
[gw1] [ 11%] PASSED cheroot/test/test_conn.py::test_HTTP11_persistent_connections 
cheroot/test/test_conn.py::test_streaming_10[True] 
cheroot/test/test_conn.py::test_keepalive[HTTP/1.0] 
[gw3] [ 12%] PASSED cheroot/test/test_conn.py::test_keepalive[HTTP/1.1] 
cheroot/test/test_conn.py::test_HTTP11_Timeout_after_request 
[gw2] [ 13%] PASSED cheroot/test/test_conn.py::test_keepalive[HTTP/1.0] 
[gw1] [ 14%] PASSED cheroot/test/test_conn.py::test_streaming_10[True] 
cheroot/test/test_conn.py::test_HTTP11_Timeout[False] 
cheroot/test/test_conn.py::test_HTTP11_Timeout[True] 
[gw2] [ 15%] PASSED cheroot/test/test_conn.py::test_HTTP11_Timeout[False] 
[gw1] [ 16%] PASSED cheroot/test/test_conn.py::test_HTTP11_Timeout[True] 
cheroot/test/test_conn.py::test_readall_or_close[0] 
cheroot/test/test_conn.py::test_100_Continue 
[gw1] [ 16%] PASSED cheroot/test/test_conn.py::test_100_Continue 
cheroot/test/test_conn.py::test_Chunked_Encoding 
[gw2] [ 17%] PASSED cheroot/test/test_conn.py::test_readall_or_close[0] 
cheroot/test/test_conn.py::test_Content_Length_in 
[gw2] [ 18%] PASSED cheroot/test/test_conn.py::test_Content_Length_in 
cheroot/test/test_core.py::test_parse_no_leading_slash_invalid[\u043f\u0440\u0438\u0432\u0456\u0442] 
[gw1] [ 19%] XFAIL cheroot/test/test_conn.py::test_Chunked_Encoding 
[gw2] [ 20%] PASSED cheroot/test/test_core.py::test_parse_no_leading_slash_invalid[\u043f\u0440\u0438\u0432\u0456\u0442] 
cheroot/test/test_core.py::test_parse_uri_absolute_uri 
[gw2] [ 21%] PASSED cheroot/test/test_core.py::test_parse_uri_absolute_uri 
cheroot/test/test_core.py::test_parse_uri_asterisk_uri 
[gw2] [ 22%] PASSED cheroot/test/test_core.py::test_parse_uri_asterisk_uri 
cheroot/test/test_core.py::test_parse_uri_fragment_uri 
[gw2] [ 23%] PASSED cheroot/test/test_core.py::test_parse_uri_fragment_uri 
cheroot/test/test_core.py::test_no_content_length 
[gw2] [ 24%] PASSED cheroot/test/test_core.py::test_no_content_length 
cheroot/test/test_core.py::test_content_length_required 
[gw2] [ 25%] PASSED cheroot/test/test_core.py::test_content_length_required 
cheroot/test/test_core.py::test_malformed_request_line[GET /-400-Malformed Request-Line] 
[gw2] [ 25%] PASSED cheroot/test/test_core.py::test_malformed_request_line[GET /-400-Malformed Request-Line] 
cheroot/test/test_core.py::test_malformed_request_line[GET / HTTPS/1.1-400-Malformed Request-Line: bad protocol] 
[gw2] [ 26%] PASSED cheroot/test/test_core.py::test_malformed_request_line[GET / HTTPS/1.1-400-Malformed Request-Line: bad protocol] 
cheroot/test/test_core.py::test_malformed_request_line[GET / HTTP/1-400-Malformed Request-Line: bad version] 
[gw2] [ 27%] PASSED cheroot/test/test_core.py::test_malformed_request_line[GET / HTTP/1-400-Malformed Request-Line: bad version] 
cheroot/test/test_core.py::test_malformed_request_line[GET / HTTP/2.15-505-Cannot fulfill request] 
[gw2] [ 28%] PASSED cheroot/test/test_core.py::test_malformed_request_line[GET / HTTP/2.15-505-Cannot fulfill request] 
cheroot/test/test_core.py::test_malformed_http_method 
[gw2] [ 29%] PASSED cheroot/test/test_core.py::test_malformed_http_method 
cheroot/test/test_core.py::test_malformed_header 
[gw2] [ 30%] PASSED cheroot/test/test_core.py::test_malformed_header 
cheroot/test/test_core.py::test_request_line_split_issue_1220 
[gw1] [ 30%] XFAIL cheroot/test/test_conn.py::test_Chunked_Encoding 
cheroot/test/test_core.py::test_normal_request 
[gw2] [ 31%] PASSED cheroot/test/test_core.py::test_request_line_split_issue_1220 
cheroot/test/test_core.py::test_garbage_in 
[gw2] [ 32%] PASSED cheroot/test/test_core.py::test_garbage_in 
cheroot/test/test_core.py::test_send_header_before_closing 
[gw2] [ 33%] PASSED cheroot/test/test_core.py::test_send_header_before_closing 
cheroot/test/test_dispatch.py::test_dispatch_no_script_name 
[gw2] [ 33%] PASSED cheroot/test/test_dispatch.py::test_dispatch_no_script_name 
cheroot/test/test_errors.py::test_plat_specific_errors[err_names0-err_nums0] 
[gw2] [ 34%] PASSED cheroot/test/test_errors.py::test_plat_specific_errors[err_names0-err_nums0] 
cheroot/test/test_errors.py::test_plat_specific_errors[err_names1-err_nums1] 
[gw2] [ 35%] PASSED cheroot/test/test_errors.py::test_plat_specific_errors[err_names1-err_nums1] 
cheroot/test/test_makefile.py::test_bytes_read 
[gw2] [ 36%] PASSED cheroot/test/test_makefile.py::test_bytes_read 
cheroot/test/test_makefile.py::test_bytes_written 
[gw2] [ 37%] PASSED cheroot/test/test_makefile.py::test_bytes_written 
cheroot/test/test_server.py::test_prepare_makes_server_ready 
[gw2] [ 38%] PASSED cheroot/test/test_server.py::test_prepare_makes_server_ready 
cheroot/test/test_server.py::test_stop_interrupts_serve 
[gw1] [ 39%] PASSED cheroot/test/test_core.py::test_normal_request 
cheroot/test/test_core.py::test_query_string_request 
[gw1] [ 40%] PASSED cheroot/test/test_core.py::test_query_string_request 
cheroot/test/test_core.py::test_parse_acceptable_uri[/hello] 
[gw1] [ 41%] PASSED cheroot/test/test_core.py::test_parse_acceptable_uri[/hello] 
cheroot/test/test_core.py::test_parse_acceptable_uri[/query_string?test=True] 
[gw1] [ 41%] PASSED cheroot/test/test_core.py::test_parse_acceptable_uri[/query_string?test=True] 
cheroot/test/test_core.py::test_parse_acceptable_uri[/%D0%AE%D1%85%D1%85%D1%83%D1%83%D1%83?%D1%97=%D0%B9%D0%BE] 
[gw1] [ 42%] PASSED cheroot/test/test_core.py::test_parse_acceptable_uri[/%D0%AE%D1%85%D1%85%D1%83%D1%83%D1%83?%D1%97=%D0%B9%D0%BE] 
cheroot/test/test_core.py::test_parse_uri_unsafe_uri 
[gw1] [ 43%] PASSED cheroot/test/test_core.py::test_parse_uri_unsafe_uri 
cheroot/test/test_core.py::test_parse_uri_invalid_uri 
[gw1] [ 44%] PASSED cheroot/test/test_core.py::test_parse_uri_invalid_uri 
cheroot/test/test_core.py::test_parse_no_leading_slash_invalid[hello] 
[gw1] [ 45%] PASSED cheroot/test/test_core.py::test_parse_no_leading_slash_invalid[hello] 
cheroot/test/test_server.py::test_bind_addr_unix[file] 
[gw1] [ 46%] PASSED cheroot/test/test_server.py::test_bind_addr_unix[file] 
cheroot/test/test_server.py::test_bind_addr_unix_abstract 
[gw1] [ 47%] SKIPPED cheroot/test/test_server.py::test_bind_addr_unix_abstract 
cheroot/test/test_server.py::test_peercreds_unix_sock[abstract] 
[gw1] [ 48%] SKIPPED cheroot/test/test_server.py::test_peercreds_unix_sock[abstract] 
cheroot/test/test_server.py::test_peercreds_unix_sock[file] 
[gw1] [ 49%] SKIPPED cheroot/test/test_server.py::test_peercreds_unix_sock[file] 
cheroot/test/test_server.py::test_peercreds_unix_sock_with_lookup[abstract] 
[gw1] [ 50%] SKIPPED cheroot/test/test_server.py::test_peercreds_unix_sock_with_lookup[abstract] 
cheroot/test/test_server.py::test_peercreds_unix_sock_with_lookup[file] 
[gw1] [ 50%] SKIPPED cheroot/test/test_server.py::test_peercreds_unix_sock_with_lookup[file] 
cheroot/test/test_ssl.py::test_ssl_adapters[builtin] 
[gw2] [ 51%] PASSED cheroot/test/test_server.py::test_stop_interrupts_serve 
cheroot/test/test_server.py::test_bind_addr_inet[0.0.0.0] 
[gw2] [ 52%] PASSED cheroot/test/test_server.py::test_bind_addr_inet[0.0.0.0] 
cheroot/test/test_server.py::test_bind_addr_inet[::] 
[gw2] [ 53%] PASSED cheroot/test/test_server.py::test_bind_addr_inet[::] 
cheroot/test/test_server.py::test_bind_addr_unix[abstract] 
[gw2] [ 54%] SKIPPED cheroot/test/test_server.py::test_bind_addr_unix[abstract] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-localhost-builtin] 
[gw0] [ 55%] PASSED cheroot/test/test_conn.py::test_keepalive_conn_management 
cheroot/test/test_conn.py::test_HTTP11_pipelining 
[gw1] [ 56%] PASSED cheroot/test/test_ssl.py::test_ssl_adapters[builtin] 
cheroot/test/test_ssl.py::test_ssl_adapters[pyopenssl] 
[gw2] [ 57%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-localhost-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-localhost-pyopenssl] 
[gw1] [ 58%] PASSED cheroot/test/test_ssl.py::test_ssl_adapters[pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-127.0.0.1-pyopenssl] 
[gw0] [ 58%] PASSED cheroot/test/test_conn.py::test_HTTP11_pipelining 
cheroot/test/test_conn.py::test_No_Message_Body 
[gw0] [ 59%] PASSED cheroot/test/test_conn.py::test_No_Message_Body 
cheroot/test/test_conn.py::test_Content_Length_out[/wrong_cl_buffered-500-The requested resource returned more bytes than the declared Content-Length.] 
[gw0] [ 60%] PASSED cheroot/test/test_conn.py::test_Content_Length_out[/wrong_cl_buffered-500-The requested resource returned more bytes than the declared Content-Length.] 
cheroot/test/test_conn.py::test_Content_Length_out[/wrong_cl_unbuffered-200-I too] Traceback (most recent call last):
  File "/Users/jaraco/code/public/cherrypy/cheroot/cheroot/server.py", line 1280, in communicate
    req.respond()
  File "/Users/jaraco/code/public/cherrypy/cheroot/cheroot/server.py", line 1083, in respond
    self.server.gateway(self).respond()
  File "/Users/jaraco/code/public/cherrypy/cheroot/cheroot/wsgi.py", line 148, in respond
    self.write(chunk)
  File "/Users/jaraco/code/public/cherrypy/cheroot/cheroot/wsgi.py", line 239, in write
    raise ValueError(
ValueError: Response body exceeds the declared Content-Length.

[gw0] [ 61%] PASSED cheroot/test/test_conn.py::test_Content_Length_out[/wrong_cl_unbuffered-200-I too] 
cheroot/test/test_conn.py::test_598 
[gw3] [ 62%] PASSED cheroot/test/test_conn.py::test_HTTP11_Timeout_after_request 
cheroot/test/test_conn.py::test_readall_or_close[1001] 
[gw2] [ 63%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-127.0.0.1-builtin] 
[gw3] [ 64%] PASSED cheroot/test/test_conn.py::test_readall_or_close[1001] 
cheroot/test/test_conn.py::test_Content_Length_not_int 
[gw3] [ 65%] PASSED cheroot/test/test_conn.py::test_Content_Length_not_int 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-False-localhost-builtin] 
[gw1] [ 66%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-127.0.0.1-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-*.localhost-builtin] 
[gw2] [ 66%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-127.0.0.1-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-not_localhost-builtin] 
[gw0] [ 67%] XFAIL cheroot/test/test_conn.py::test_598 
cheroot/test/test_conn.py::test_No_CRLF[\n\n] 
[gw0] [ 68%] PASSED cheroot/test/test_conn.py::test_No_CRLF[\n\n] 
cheroot/test/test_conn.py::test_No_CRLF[\r\n\n] 
[gw0] [ 69%] PASSED cheroot/test/test_conn.py::test_No_CRLF[\r\n\n] 
cheroot/test/test_core.py::test_http_connect_request 
[gw0] [ 70%] PASSED cheroot/test/test_core.py::test_http_connect_request 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-localhost-builtin] 
[gw1] [ 71%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-*.localhost-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-*.localhost-pyopenssl] 
[gw3] [ 72%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-False-localhost-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-False-localhost-pyopenssl] 
[gw2] [ 73%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-not_localhost-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-not_localhost-pyopenssl] 
[gw0] [ 74%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-localhost-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-localhost-pyopenssl] 
[gw1] [ 75%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-*.localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-127.0.0.1-builtin] 
[gw0] [ 75%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-not_localhost-builtin] 
[gw3] [ 76%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-False-localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-*.localhost-builtin] 
[gw2] [ 77%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_NONE-True-not_localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-*.localhost-pyopenssl] 
[gw1] [ 78%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-127.0.0.1-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-127.0.0.1-pyopenssl] 
[gw0] [ 79%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-not_localhost-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-not_localhost-pyopenssl] 
[gw1] [ 80%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-127.0.0.1-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-localhost-builtin] 
[gw3] [ 81%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-*.localhost-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-False-localhost-builtin] 
[gw2] [ 82%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-*.localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-False-localhost-pyopenssl] 
[gw1] [ 83%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-localhost-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-127.0.0.1-builtin] 
[gw0] [ 83%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-not_localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-localhost-pyopenssl] 
[gw2] [ 84%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-False-localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-*.localhost-builtin] 
[gw2] [ 85%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-*.localhost-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-not_localhost-pyopenssl] 
[gw0] [ 86%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-not_localhost-builtin] 
[gw1] [ 87%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-127.0.0.1-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-*.localhost-pyopenssl] 
[gw0] [ 88%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-not_localhost-builtin] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-False-localhost-pyopenssl] 
[gw3] [ 89%] FAILED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-False-localhost-builtin] 
[gw1] [ 90%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-*.localhost-pyopenssl] 
[gw2] [ 91%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-not_localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_https_over_http_error[0.0.0.0] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-127.0.0.1-pyopenssl] 
cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-False-localhost-builtin] 
[gw1] [ 91%] PASSED cheroot/test/test_ssl.py::test_https_over_http_error[0.0.0.0] 
cheroot/test/test_ssl.py::test_http_over_https_error[0.0.0.0-pyopenssl] 
[gw1] [ 92%] PASSED cheroot/test/test_ssl.py::test_http_over_https_error[0.0.0.0-pyopenssl] 
cheroot/test/test_ssl.py::test_http_over_https_error[::-pyopenssl] 
[gw1] [ 93%] PASSED cheroot/test/test_ssl.py::test_http_over_https_error[::-pyopenssl] 
cheroot/test/webtest.py::cheroot.test.webtest.strip_netloc 
[gw1] [ 94%] PASSED cheroot/test/webtest.py::cheroot.test.webtest.strip_netloc 
[gw3] [ 95%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-127.0.0.1-pyopenssl] 
cheroot/test/test_ssl.py::test_http_over_https_error[0.0.0.0-builtin] 
[gw0] [ 96%] PASSED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-False-localhost-pyopenssl] 
cheroot/test/test_ssl.py::test_https_over_http_error[::] 
[gw0] [ 97%] PASSED cheroot/test/test_ssl.py::test_https_over_http_error[::] 
[gw2] [ 98%] FAILED cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-False-localhost-builtin] 
cheroot/test/test_ssl.py::test_http_over_https_error[::-builtin] 
[gw3] [ 99%] PASSED cheroot/test/test_ssl.py::test_http_over_https_error[0.0.0.0-builtin] 
[gw2] [100%] PASSED cheroot/test/test_ssl.py::test_http_over_https_error[::-builtin] 

=================================== FAILURES ===================================
____ test_tls_client_auth[VerifyMode.CERT_OPTIONAL-False-localhost-builtin] ____
[gw3] darwin -- Python 3.8.0 /Users/jaraco/code/public/cherrypy/cheroot/.tox/python/bin/python

self = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x10d4ac6d0>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def _send_until_done(self, data):
        while True:
            try:
>               return self.connection.send(data)

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x10d4ac6d0>

.tox/python/lib/python3.8/site-packages/urllib3/contrib/pyopenssl.py:340: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <OpenSSL.SSL.Connection object at 0x10d4ac760>
buf = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'
flags = 0

    def send(self, buf, flags=0):
        """
        Send data on the connection. NOTE: If you get one of the WantRead,
        WantWrite or WantX509Lookup exceptions on this, you have to call the
        method again with the SAME buffer.
    
        :param buf: The string, buffer or memoryview to send
        :param flags: (optional) Included for compatibility with the socket
                      API, the value is ignored
        :return: The number of bytes written
        """
        # Backward compatibility
        buf = _text_to_bytes_and_warn("buf", buf)
    
        if isinstance(buf, memoryview):
            buf = buf.tobytes()
        if isinstance(buf, _buffer):
            buf = str(buf)
        if not isinstance(buf, bytes):
            raise TypeError("data must be a memoryview, buffer or byte string")
        if len(buf) > 2147483647:
            raise ValueError("Cannot send more than 2**31-1 bytes at once.")
    
        result = _lib.SSL_write(self._ssl, buf, len(buf))
>       self._raise_ssl_error(self._ssl, result)

buf        = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
flags      = 0
result     = -1
self       = <OpenSSL.SSL.Connection object at 0x10d4ac760>

.tox/python/lib/python3.8/site-packages/OpenSSL/SSL.py:1737: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <OpenSSL.SSL.Connection object at 0x10d4ac760>
ssl = <cdata 'SSL *' 0x7ffce710fe00>, result = -1

    def _raise_ssl_error(self, ssl, result):
        if self._context._verify_helper is not None:
            self._context._verify_helper.raise_if_problem()
        if self._context._npn_advertise_helper is not None:
            self._context._npn_advertise_helper.raise_if_problem()
        if self._context._npn_select_helper is not None:
            self._context._npn_select_helper.raise_if_problem()
        if self._context._alpn_select_helper is not None:
            self._context._alpn_select_helper.raise_if_problem()
        if self._context._ocsp_helper is not None:
            self._context._ocsp_helper.raise_if_problem()
    
        error = _lib.SSL_get_error(ssl, result)
        if error == _lib.SSL_ERROR_WANT_READ:
            raise WantReadError()
        elif error == _lib.SSL_ERROR_WANT_WRITE:
            raise WantWriteError()
        elif error == _lib.SSL_ERROR_ZERO_RETURN:
            raise ZeroReturnError()
        elif error == _lib.SSL_ERROR_WANT_X509_LOOKUP:
            # TODO: This is untested.
            raise WantX509LookupError()
        elif error == _lib.SSL_ERROR_SYSCALL:
            if _lib.ERR_peek_error() == 0:
                if result < 0:
                    if platform == "win32":
                        errno = _ffi.getwinerror()[0]
                    else:
                        errno = _ffi.errno
    
                    if errno != 0:
>                       raise SysCallError(errno, errorcode.get(errno))
E                       OpenSSL.SSL.SysCallError: (32, 'EPIPE')

errno      = 32
error      = 5
result     = -1
self       = <OpenSSL.SSL.Connection object at 0x10d4ac760>
ssl        = <cdata 'SSL *' 0x7ffce710fe00>

.tox/python/lib/python3.8/site-packages/OpenSSL/SSL.py:1639: SysCallError

During handling of the above exception, another exception occurred:

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = <urllib3.util.timeout.Timeout object at 0x10d4926d0>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <urllib3.util.timeout.Timeout object at 0x10d4c3ee0>
is_new_proxy_conn = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.
    
        .. note::
    
           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.
    
        .. note::
    
           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param body:
            Data to send in the request body (useful for creating
            POST requests, see HTTPConnectionPool.post_url for
            more convenience).
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.
    
        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When False, you can
            use the pool on an HTTP proxy and request foreign hosts.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.
    
        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.
    
        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parse_url(url).url)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == "http":
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
>           httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )

assert_same_host = False
body       = None
body_pos   = None
chunked    = False
clean_exit = False
conn       = None
err        = None
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
is_new_proxy_conn = False
method     = 'GET'
pool_timeout = None
redirect   = False
release_conn = False
release_this_conn = True
response_kw = {'decode_content': False, 'preload_content': False}
retries    = Retry(total=0, connect=None, read=False, redirect=None, status=None)
self       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
timeout    = <urllib3.util.timeout.Timeout object at 0x10d4926d0>
timeout_obj = <urllib3.util.timeout.Timeout object at 0x10d4c3ee0>
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/connectionpool.py:665: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
conn = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
method = 'GET', url = '/'
timeout = <urllib3.util.timeout.Timeout object at 0x10d4c3ee0>, chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = <urllib3.util.timeout.Timeout object at 0x10d4c3d90>

    def _make_request(
        self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
    ):
        """
        Perform a request on a given urllib connection object taken from our
        pool.
    
        :param conn:
            a connection from one of our connection pools
    
        :param timeout:
            Socket timeout in seconds for the request. This can be a
            float or integer, which will set the same timeout value for
            the socket connect and the socket read, or an instance of
            :class:`urllib3.util.Timeout`, which gives you more fine-grained
            control over your timeouts.
        """
        self.num_requests += 1
    
        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout
    
        # Trigger any extra validation we need to do.
        try:
            self._validate_conn(conn)
        except (SocketTimeout, BaseSSLError) as e:
            # Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
            self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
            raise
    
        # conn.request() calls httplib.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        if chunked:
            conn.request_chunked(method, url, **httplib_request_kw)
        else:
>           conn.request(method, url, **httplib_request_kw)

chunked    = False
conn       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
httplib_request_kw = {'body': None,
 'headers': {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
method     = 'GET'
self       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
timeout    = <urllib3.util.timeout.Timeout object at 0x10d4c3ee0>
timeout_obj = <urllib3.util.timeout.Timeout object at 0x10d4c3d90>
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/connectionpool.py:387: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}

    def request(self, method, url, body=None, headers={}, *,
                encode_chunked=False):
        """Send a complete request to the server."""
>       self._send_request(method, url, body, headers, encode_chunked)

body       = None
encode_chunked = False
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
method     = 'GET'
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
url        = '/'

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1230: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
encode_chunked = False

    def _send_request(self, method, url, body, headers, encode_chunked):
        # Honor explicitly requested Host: and Accept-Encoding: headers.
        header_names = frozenset(k.lower() for k in headers)
        skips = {}
        if 'host' in header_names:
            skips['skip_host'] = 1
        if 'accept-encoding' in header_names:
            skips['skip_accept_encoding'] = 1
    
        self.putrequest(method, url, **skips)
    
        # chunked encoding will happen if HTTP/1.1 is used and either
        # the caller passes encode_chunked=True or the following
        # conditions hold:
        # 1. content-length has not been explicitly set
        # 2. the body is a file or iterable, but not a str or bytes-like
        # 3. Transfer-Encoding has NOT been explicitly set by the caller
    
        if 'content-length' not in header_names:
            # only chunk body if not explicitly set for backwards
            # compatibility, assuming the client code is already handling the
            # chunking
            if 'transfer-encoding' not in header_names:
                # if content-length cannot be automatically determined, fall
                # back to chunked encoding
                encode_chunked = False
                content_length = self._get_content_length(body, method)
                if content_length is None:
                    if body is not None:
                        if self.debuglevel > 0:
                            print('Unable to determine size of %r' % body)
                        encode_chunked = True
                        self.putheader('Transfer-Encoding', 'chunked')
                else:
                    self.putheader('Content-Length', str(content_length))
        else:
            encode_chunked = False
    
        for hdr, value in headers.items():
            self.putheader(hdr, value)
        if isinstance(body, str):
            # RFC 2616 Section 3.7.1 says that text default has a
            # default charset of iso-8859-1.
            body = _encode(body, 'body')
>       self.endheaders(body, encode_chunked=encode_chunked)

body       = None
content_length = None
encode_chunked = False
hdr        = 'Connection'
header_names = frozenset({'connection', 'user-agent', 'accept', 'accept-encoding'})
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
method     = 'GET'
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
skips      = {'skip_accept_encoding': 1}
url        = '/'
value      = 'keep-alive'

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1276: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
message_body = None

    def endheaders(self, message_body=None, *, encode_chunked=False):
        """Indicate that the last header line has been sent to the server.
    
        This method sends the request to the server.  The optional message_body
        argument can be used to pass a message body associated with the
        request.
        """
        if self.__state == _CS_REQ_STARTED:
            self.__state = _CS_REQ_SENT
        else:
            raise CannotSendHeader()
>       self._send_output(message_body, encode_chunked=encode_chunked)

encode_chunked = False
message_body = None
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1225: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
message_body = None, encode_chunked = False

    def _send_output(self, message_body=None, encode_chunked=False):
        """Send the currently buffered request and clear the buffer.
    
        Appends an extra \\r\\n to the buffer.
        A message_body may be specified, to be appended to the request.
        """
        self._buffer.extend((b"", b""))
        msg = b"\r\n".join(self._buffer)
        del self._buffer[:]
>       self.send(msg)

encode_chunked = False
message_body = None
msg        = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1004: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def send(self, data):
        """Send `data' to the server.
        ``data`` can be a string object, a bytes object, an array object, a
        file-like object that supports a .read() method, or an iterable object.
        """
    
        if self.sock is None:
            if self.auto_open:
                self.connect()
            else:
                raise NotConnected()
    
        if self.debuglevel > 0:
            print("send:", repr(data))
        if hasattr(data, "read") :
            if self.debuglevel > 0:
                print("sendIng a read()able")
            encode = self._is_textIO(data)
            if encode and self.debuglevel > 0:
                print("encoding file using iso-8859-1")
            while 1:
                datablock = data.read(self.blocksize)
                if not datablock:
                    break
                if encode:
                    datablock = datablock.encode("iso-8859-1")
                self.sock.sendall(datablock)
            return
        try:
>           self.sock.sendall(data)

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:965: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x10d4ac6d0>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def sendall(self, data):
        total_sent = 0
        while total_sent < len(data):
>           sent = self._send_until_done(
                data[total_sent : total_sent + SSL_WRITE_BLOCKSIZE]
            )

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x10d4ac6d0>
total_sent = 0

.tox/python/lib/python3.8/site-packages/urllib3/contrib/pyopenssl.py:351: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x10d4ac6d0>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def _send_until_done(self, data):
        while True:
            try:
                return self.connection.send(data)
            except OpenSSL.SSL.WantWriteError:
                if not util.wait_for_write(self.socket, self.socket.gettimeout()):
                    raise timeout()
                continue
            except OpenSSL.SSL.SysCallError as e:
>               raise SocketError(str(e))
E               OSError: (32, 'EPIPE')

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x10d4ac6d0>

.tox/python/lib/python3.8/site-packages/urllib3/contrib/pyopenssl.py:346: OSError

During handling of the above exception, another exception occurred:

self = <requests.adapters.HTTPAdapter object at 0x10d475df0>
request = <PreparedRequest [GET]>, stream = False
timeout = <urllib3.util.timeout.Timeout object at 0x10d4926d0>
verify = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmprbkju_i3.pem'
cert = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpgv909sa6.pem'
proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """
    
        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
>               resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )

cert       = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpgv909sa6.pem'
chunked    = False
conn       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
proxies    = OrderedDict()
request    = <PreparedRequest [GET]>
self       = <requests.adapters.HTTPAdapter object at 0x10d475df0>
stream     = False
timeout    = <urllib3.util.timeout.Timeout object at 0x10d4926d0>
url        = '/'
verify     = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmprbkju_i3.pem'

.tox/python/lib/python3.8/site-packages/requests/adapters.py:439: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = <urllib3.util.timeout.Timeout object at 0x10d4926d0>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <urllib3.util.timeout.Timeout object at 0x10d4c3ee0>
is_new_proxy_conn = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.
    
        .. note::
    
           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.
    
        .. note::
    
           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param body:
            Data to send in the request body (useful for creating
            POST requests, see HTTPConnectionPool.post_url for
            more convenience).
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.
    
        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When False, you can
            use the pool on an HTTP proxy and request foreign hosts.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.
    
        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.
    
        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parse_url(url).url)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == "http":
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )
    
            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None
    
            # Pass method to Response for length checking
            response_kw["request_method"] = method
    
            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(
                httplib_response,
                pool=self,
                connection=response_conn,
                retries=retries,
                **response_kw
            )
    
            # Everything went great!
            clean_exit = True
    
        except queue.Empty:
            # Timed out by queue.
            raise EmptyPoolError(self, "No pool connections are available.")
    
        except (
            TimeoutError,
            HTTPException,
            SocketError,
            ProtocolError,
            BaseSSLError,
            SSLError,
            CertificateError,
        ) as e:
            # Discard the connection for these exceptions. It will be
            # replaced during the next _get_conn() call.
            clean_exit = False
            if isinstance(e, (BaseSSLError, CertificateError)):
                e = SSLError(e)
            elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError("Cannot connect to proxy.", e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError("Connection aborted.", e)
    
>           retries = retries.increment(
                method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
            )

assert_same_host = False
body       = None
body_pos   = None
chunked    = False
clean_exit = False
conn       = None
err        = None
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
is_new_proxy_conn = False
method     = 'GET'
pool_timeout = None
redirect   = False
release_conn = False
release_this_conn = True
response_kw = {'decode_content': False, 'preload_content': False}
retries    = Retry(total=0, connect=None, read=False, redirect=None, status=None)
self       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
timeout    = <urllib3.util.timeout.Timeout object at 0x10d4926d0>
timeout_obj = <urllib3.util.timeout.Timeout object at 0x10d4c3ee0>
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/connectionpool.py:719: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'GET', url = '/', response = None
error = ProtocolError('Connection aborted.', OSError("(32, 'EPIPE')"))
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
_stacktrace = <traceback object at 0x10d530fc0>

    def increment(
        self,
        method=None,
        url=None,
        response=None,
        error=None,
        _pool=None,
        _stacktrace=None,
    ):
        """ Return a new Retry object with incremented retry counters.
    
        :param response: A response object, or None, if the server did not
            return a response.
        :type response: :class:`~urllib3.response.HTTPResponse`
        :param Exception error: An error encountered during the request, or
            None if the response was received successfully.
    
        :return: A new ``Retry`` object.
        """
        if self.total is False and error:
            # Disabled, indicate to re-raise the error.
            raise six.reraise(type(error), error, _stacktrace)
    
        total = self.total
        if total is not None:
            total -= 1
    
        connect = self.connect
        read = self.read
        redirect = self.redirect
        status_count = self.status
        cause = "unknown"
        status = None
        redirect_location = None
    
        if error and self._is_connection_error(error):
            # Connect retry?
            if connect is False:
                raise six.reraise(type(error), error, _stacktrace)
            elif connect is not None:
                connect -= 1
    
        elif error and self._is_read_error(error):
            # Read retry?
            if read is False or not self._is_method_retryable(method):
>               raise six.reraise(type(error), error, _stacktrace)

_pool      = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
_stacktrace = <traceback object at 0x10d530fc0>
cause      = 'unknown'
connect    = None
error      = ProtocolError('Connection aborted.', OSError("(32, 'EPIPE')"))
method     = 'GET'
read       = False
redirect   = None
redirect_location = None
response   = None
self       = Retry(total=0, connect=None, read=False, redirect=None, status=None)
status     = None
status_count = None
total      = -1
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/util/retry.py:400: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

tp = <class 'urllib3.exceptions.ProtocolError'>, value = None, tb = None

    def reraise(tp, value, tb=None):
        try:
            if value is None:
                value = tp()
            if value.__traceback__ is not tb:
>               raise value.with_traceback(tb)

tb         = None
tp         = <class 'urllib3.exceptions.ProtocolError'>
value      = None

.tox/python/lib/python3.8/site-packages/urllib3/packages/six.py:734: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = <urllib3.util.timeout.Timeout object at 0x10d4926d0>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <urllib3.util.timeout.Timeout object at 0x10d4c3ee0>
is_new_proxy_conn = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.
    
        .. note::
    
           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.
    
        .. note::
    
           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param body:
            Data to send in the request body (useful for creating
            POST requests, see HTTPConnectionPool.post_url for
            more convenience).
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.
    
        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When False, you can
            use the pool on an HTTP proxy and request foreign hosts.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.
    
        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.
    
        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parse_url(url).url)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == "http":
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
>           httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )

assert_same_host = False
body       = None
body_pos   = None
chunked    = False
clean_exit = False
conn       = None
err        = None
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
is_new_proxy_conn = False
method     = 'GET'
pool_timeout = None
redirect   = False
release_conn = False
release_this_conn = True
response_kw = {'decode_content': False, 'preload_content': False}
retries    = Retry(total=0, connect=None, read=False, redirect=None, status=None)
self       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
timeout    = <urllib3.util.timeout.Timeout object at 0x10d4926d0>
timeout_obj = <urllib3.util.timeout.Timeout object at 0x10d4c3ee0>
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/connectionpool.py:665: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
conn = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
method = 'GET', url = '/'
timeout = <urllib3.util.timeout.Timeout object at 0x10d4c3ee0>, chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = <urllib3.util.timeout.Timeout object at 0x10d4c3d90>

    def _make_request(
        self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
    ):
        """
        Perform a request on a given urllib connection object taken from our
        pool.
    
        :param conn:
            a connection from one of our connection pools
    
        :param timeout:
            Socket timeout in seconds for the request. This can be a
            float or integer, which will set the same timeout value for
            the socket connect and the socket read, or an instance of
            :class:`urllib3.util.Timeout`, which gives you more fine-grained
            control over your timeouts.
        """
        self.num_requests += 1
    
        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout
    
        # Trigger any extra validation we need to do.
        try:
            self._validate_conn(conn)
        except (SocketTimeout, BaseSSLError) as e:
            # Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
            self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
            raise
    
        # conn.request() calls httplib.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        if chunked:
            conn.request_chunked(method, url, **httplib_request_kw)
        else:
>           conn.request(method, url, **httplib_request_kw)

chunked    = False
conn       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
httplib_request_kw = {'body': None,
 'headers': {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
method     = 'GET'
self       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
timeout    = <urllib3.util.timeout.Timeout object at 0x10d4c3ee0>
timeout_obj = <urllib3.util.timeout.Timeout object at 0x10d4c3d90>
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/connectionpool.py:387: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}

    def request(self, method, url, body=None, headers={}, *,
                encode_chunked=False):
        """Send a complete request to the server."""
>       self._send_request(method, url, body, headers, encode_chunked)

body       = None
encode_chunked = False
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
method     = 'GET'
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
url        = '/'

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1230: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
encode_chunked = False

    def _send_request(self, method, url, body, headers, encode_chunked):
        # Honor explicitly requested Host: and Accept-Encoding: headers.
        header_names = frozenset(k.lower() for k in headers)
        skips = {}
        if 'host' in header_names:
            skips['skip_host'] = 1
        if 'accept-encoding' in header_names:
            skips['skip_accept_encoding'] = 1
    
        self.putrequest(method, url, **skips)
    
        # chunked encoding will happen if HTTP/1.1 is used and either
        # the caller passes encode_chunked=True or the following
        # conditions hold:
        # 1. content-length has not been explicitly set
        # 2. the body is a file or iterable, but not a str or bytes-like
        # 3. Transfer-Encoding has NOT been explicitly set by the caller
    
        if 'content-length' not in header_names:
            # only chunk body if not explicitly set for backwards
            # compatibility, assuming the client code is already handling the
            # chunking
            if 'transfer-encoding' not in header_names:
                # if content-length cannot be automatically determined, fall
                # back to chunked encoding
                encode_chunked = False
                content_length = self._get_content_length(body, method)
                if content_length is None:
                    if body is not None:
                        if self.debuglevel > 0:
                            print('Unable to determine size of %r' % body)
                        encode_chunked = True
                        self.putheader('Transfer-Encoding', 'chunked')
                else:
                    self.putheader('Content-Length', str(content_length))
        else:
            encode_chunked = False
    
        for hdr, value in headers.items():
            self.putheader(hdr, value)
        if isinstance(body, str):
            # RFC 2616 Section 3.7.1 says that text default has a
            # default charset of iso-8859-1.
            body = _encode(body, 'body')
>       self.endheaders(body, encode_chunked=encode_chunked)

body       = None
content_length = None
encode_chunked = False
hdr        = 'Connection'
header_names = frozenset({'connection', 'user-agent', 'accept', 'accept-encoding'})
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
method     = 'GET'
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
skips      = {'skip_accept_encoding': 1}
url        = '/'
value      = 'keep-alive'

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1276: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
message_body = None

    def endheaders(self, message_body=None, *, encode_chunked=False):
        """Indicate that the last header line has been sent to the server.
    
        This method sends the request to the server.  The optional message_body
        argument can be used to pass a message body associated with the
        request.
        """
        if self.__state == _CS_REQ_STARTED:
            self.__state = _CS_REQ_SENT
        else:
            raise CannotSendHeader()
>       self._send_output(message_body, encode_chunked=encode_chunked)

encode_chunked = False
message_body = None
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1225: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
message_body = None, encode_chunked = False

    def _send_output(self, message_body=None, encode_chunked=False):
        """Send the currently buffered request and clear the buffer.
    
        Appends an extra \\r\\n to the buffer.
        A message_body may be specified, to be appended to the request.
        """
        self._buffer.extend((b"", b""))
        msg = b"\r\n".join(self._buffer)
        del self._buffer[:]
>       self.send(msg)

encode_chunked = False
message_body = None
msg        = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1004: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def send(self, data):
        """Send `data' to the server.
        ``data`` can be a string object, a bytes object, an array object, a
        file-like object that supports a .read() method, or an iterable object.
        """
    
        if self.sock is None:
            if self.auto_open:
                self.connect()
            else:
                raise NotConnected()
    
        if self.debuglevel > 0:
            print("send:", repr(data))
        if hasattr(data, "read") :
            if self.debuglevel > 0:
                print("sendIng a read()able")
            encode = self._is_textIO(data)
            if encode and self.debuglevel > 0:
                print("encoding file using iso-8859-1")
            while 1:
                datablock = data.read(self.blocksize)
                if not datablock:
                    break
                if encode:
                    datablock = datablock.encode("iso-8859-1")
                self.sock.sendall(datablock)
            return
        try:
>           self.sock.sendall(data)

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x10d4c3d30>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:965: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x10d4ac6d0>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def sendall(self, data):
        total_sent = 0
        while total_sent < len(data):
>           sent = self._send_until_done(
                data[total_sent : total_sent + SSL_WRITE_BLOCKSIZE]
            )

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x10d4ac6d0>
total_sent = 0

.tox/python/lib/python3.8/site-packages/urllib3/contrib/pyopenssl.py:351: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x10d4ac6d0>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def _send_until_done(self, data):
        while True:
            try:
                return self.connection.send(data)
            except OpenSSL.SSL.WantWriteError:
                if not util.wait_for_write(self.socket, self.socket.gettimeout()):
                    raise timeout()
                continue
            except OpenSSL.SSL.SysCallError as e:
>               raise SocketError(str(e))
E               urllib3.exceptions.ProtocolError: ('Connection aborted.', OSError("(32, 'EPIPE')"))

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63830\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x10d4ac6d0>

.tox/python/lib/python3.8/site-packages/urllib3/contrib/pyopenssl.py:346: ProtocolError

During handling of the above exception, another exception occurred:

mocker = <pytest_mock.MockFixture object at 0x10d3384c0>
tls_http_server = <generator object tls_http_server.<locals>.start_srv at 0x10d30be40>
adapter_type = 'builtin', ca = <trustme.CA object at 0x10d4c9fa0>
tls_certificate = <trustme.LeafCert object at 0x10d2f7ca0>
tls_certificate_chain_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpodpxct6o.pem'
tls_certificate_private_key_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpw5nvnapi.pem'
tls_ca_certificate_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmprbkju_i3.pem'
is_trusted_cert = False, tls_client_identity = 'localhost'
tls_verify_mode = <VerifyMode.CERT_OPTIONAL: 1>

    @pytest.mark.parametrize(
        'adapter_type',
        (
            'builtin',
            'pyopenssl',
        ),
    )
    @pytest.mark.parametrize(
        'is_trusted_cert,tls_client_identity',
        (
            (True, 'localhost'), (True, '127.0.0.1'),
            (True, '*.localhost'), (True, 'not_localhost'),
            (False, 'localhost'),
        ),
    )
    @pytest.mark.parametrize(
        'tls_verify_mode',
        (
            ssl.CERT_NONE,  # server shouldn't validate client cert
            ssl.CERT_OPTIONAL,  # same as CERT_REQUIRED in client mode, don't use
            ssl.CERT_REQUIRED,  # server should validate if client cert CA is OK
        ),
    )
    def test_tls_client_auth(
        # FIXME: remove twisted logic, separate tests
        mocker,
        tls_http_server, adapter_type,
        ca,
        tls_certificate,
        tls_certificate_chain_pem_path,
        tls_certificate_private_key_pem_path,
        tls_ca_certificate_pem_path,
        is_trusted_cert, tls_client_identity,
        tls_verify_mode,
    ):
        """Verify that client TLS certificate auth works correctly."""
        test_cert_rejection = (
            tls_verify_mode != ssl.CERT_NONE
            and not is_trusted_cert
        )
        interface, _host, port = _get_conn_data(ANY_INTERFACE_IPV4)
    
        client_cert_root_ca = ca if is_trusted_cert else trustme.CA()
        with mocker.mock_module.patch(
            'idna.core.ulabel',
            return_value=ntob(tls_client_identity),
        ):
            client_cert = client_cert_root_ca.issue_server_cert(
                # FIXME: change to issue_cert once new trustme is out
                ntou(tls_client_identity),
            )
            del client_cert_root_ca
    
        with client_cert.private_key_and_cert_chain_pem.tempfile() as cl_pem:
            tls_adapter_cls = get_ssl_adapter_class(name=adapter_type)
            tls_adapter = tls_adapter_cls(
                tls_certificate_chain_pem_path,
                tls_certificate_private_key_pem_path,
            )
            if adapter_type == 'pyopenssl':
                tls_adapter.context = tls_adapter.get_context()
                tls_adapter.context.set_verify(
                    _stdlib_to_openssl_verify[tls_verify_mode],
                    lambda conn, cert, errno, depth, preverify_ok: preverify_ok,
                )
            else:
                tls_adapter.context.verify_mode = tls_verify_mode
    
            ca.configure_trust(tls_adapter.context)
            tls_certificate.configure_cert(tls_adapter.context)
    
            tlshttpserver = tls_http_server.send(
                (
                    (interface, port),
                    tls_adapter,
                ),
            )
    
            interface, _host, port = _get_conn_data(tlshttpserver.bind_addr)
    
            make_https_request = functools.partial(
                requests.get,
                'https://' + interface + ':' + str(port) + '/',
    
                # Server TLS certificate verification:
                verify=tls_ca_certificate_pem_path,
    
                # Client TLS certificate verification:
                cert=cl_pem,
            )
    
            if not test_cert_rejection:
                resp = make_https_request()
                is_req_successful = resp.status_code == 200
                if (
                        not is_req_successful
                        and IS_PYOPENSSL_SSL_VERSION_1_0
                        and adapter_type == 'builtin'
                        and tls_verify_mode == ssl.CERT_REQUIRED
                        and tls_client_identity == 'localhost'
                        and is_trusted_cert
                ) or PY34:
                    pytest.xfail(
                        'OpenSSL 1.0 has problems with verifying client certs',
                    )
                assert is_req_successful
                assert resp.text == 'Hello world!'
                return
    
            expected_ssl_errors = (
                requests.exceptions.SSLError,
                OpenSSL.SSL.Error,
            ) if PY34 else (
                requests.exceptions.SSLError,
            )
            if IS_WINDOWS or IS_GITHUB_ACTIONS_WORKFLOW:
                expected_ssl_errors += requests.exceptions.ConnectionError,
            with pytest.raises(expected_ssl_errors) as ssl_err:
>               make_https_request()

_host      = '127.0.0.1'
adapter_type = 'builtin'
ca         = <trustme.CA object at 0x10d4c9fa0>
cl_pem     = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpgv909sa6.pem'
client_cert = <trustme.LeafCert object at 0x10d4dbf40>
expected_ssl_errors = (<class 'requests.exceptions.SSLError'>,)
interface  = '127.0.0.1'
is_trusted_cert = False
make_https_request = functools.partial(<function get at 0x10d2785e0>, 'https://127.0.0.1:63830/', verify='/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmprbkju_i3.pem', cert='/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpgv909sa6.pem')
mocker     = <pytest_mock.MockFixture object at 0x10d3384c0>
port       = 63830
ssl_err    = <ExceptionInfo for raises contextmanager>
test_cert_rejection = True
tls_adapter = <cheroot.ssl.builtin.BuiltinSSLAdapter object at 0x10d34c2e0>
tls_adapter_cls = <class 'cheroot.ssl.builtin.BuiltinSSLAdapter'>
tls_ca_certificate_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmprbkju_i3.pem'
tls_certificate = <trustme.LeafCert object at 0x10d2f7ca0>
tls_certificate_chain_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpodpxct6o.pem'
tls_certificate_private_key_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpw5nvnapi.pem'
tls_client_identity = 'localhost'
tls_http_server = <generator object tls_http_server.<locals>.start_srv at 0x10d30be40>
tls_verify_mode = <VerifyMode.CERT_OPTIONAL: 1>
tlshttpserver = <cheroot.server.HTTPServer object at 0x10d34c820>

cheroot/test/test_ssl.py:327: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/python/lib/python3.8/site-packages/requests/api.py:75: in get
    return request('get', url, params=params, **kwargs)
.tox/python/lib/python3.8/site-packages/requests/api.py:60: in request
    return session.request(method=method, url=url, **kwargs)
.tox/python/lib/python3.8/site-packages/requests/sessions.py:533: in request
    resp = self.send(prep, **send_kwargs)
.tox/python/lib/python3.8/site-packages/requests/sessions.py:646: in send
    r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.adapters.HTTPAdapter object at 0x10d475df0>
request = <PreparedRequest [GET]>, stream = False
timeout = <urllib3.util.timeout.Timeout object at 0x10d4926d0>
verify = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmprbkju_i3.pem'
cert = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpgv909sa6.pem'
proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """
    
        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )
    
            # Send the request.
            else:
                if hasattr(conn, 'proxy_pool'):
                    conn = conn.proxy_pool
    
                low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
    
                try:
                    low_conn.putrequest(request.method,
                                        url,
                                        skip_accept_encoding=True)
    
                    for header, value in request.headers.items():
                        low_conn.putheader(header, value)
    
                    low_conn.endheaders()
    
                    for i in request.body:
                        low_conn.send(hex(len(i))[2:].encode('utf-8'))
                        low_conn.send(b'\r\n')
                        low_conn.send(i)
                        low_conn.send(b'\r\n')
                    low_conn.send(b'0\r\n\r\n')
    
                    # Receive the response from the server
                    try:
                        # For Python 2.7, use buffering of HTTP responses
                        r = low_conn.getresponse(buffering=True)
                    except TypeError:
                        # For compatibility with Python 3.3+
                        r = low_conn.getresponse()
    
                    resp = HTTPResponse.from_httplib(
                        r,
                        pool=conn,
                        connection=low_conn,
                        preload_content=False,
                        decode_content=False
                    )
                except:
                    # If we hit any problems here, clean up the connection.
                    # Then, reraise so that we can handle the actual exception.
                    low_conn.close()
                    raise
    
        except (ProtocolError, socket.error) as err:
>           raise ConnectionError(err, request=request)
E           requests.exceptions.ConnectionError: ('Connection aborted.', OSError("(32, 'EPIPE')"))

cert       = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpgv909sa6.pem'
chunked    = False
conn       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x10d492820>
proxies    = OrderedDict()
request    = <PreparedRequest [GET]>
self       = <requests.adapters.HTTPAdapter object at 0x10d475df0>
stream     = False
timeout    = <urllib3.util.timeout.Timeout object at 0x10d4926d0>
url        = '/'
verify     = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmprbkju_i3.pem'

.tox/python/lib/python3.8/site-packages/requests/adapters.py:498: ConnectionError
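
For reference, the client call that blows up above (visible in the make_https_request locals) is just requests.get() against the fixture server, with the server CA bundle passed as verify= and the untrusted client key+cert PEM passed as cert=. A minimal sketch of that client side, assuming placeholder PEM paths (the port is the ephemeral one from this particular run):

    # Hedged sketch, not part of the test suite; PEM paths are placeholders.
    import requests

    server_ca_pem = '/path/to/server-ca.pem'      # CA that signed the *server* certificate
    untrusted_client_pem = '/path/to/client.pem'  # client key+cert chain signed by a different CA

    try:
        requests.get(
            'https://127.0.0.1:63830/',       # address bound by the tls_http_server fixture in this run
            verify=server_ca_pem,             # server certificate verification (succeeds)
            cert=untrusted_client_pem,        # client certificate the server is expected to reject
        )
    except requests.exceptions.SSLError:
        pass  # what the test expects for CERT_OPTIONAL/CERT_REQUIRED with an untrusted client cert
    except requests.exceptions.ConnectionError:
        pass  # what actually happens here on macOS: EPIPE while writing the request

As the expected_ssl_errors locals above show, the test only adds ConnectionError to the accepted exceptions on Windows or GitHub Actions, so the macOS EPIPE surfaces as a failure rather than an accepted rejection.
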
____ test_tls_client_auth[VerifyMode.CERT_REQUIRED-False-localhost-builtin] ____
[gw2] darwin -- Python 3.8.0 /Users/jaraco/code/public/cherrypy/cheroot/.tox/python/bin/python

self = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x109206940>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def _send_until_done(self, data):
        while True:
            try:
>               return self.connection.send(data)

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x109206940>

.tox/python/lib/python3.8/site-packages/urllib3/contrib/pyopenssl.py:340: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <OpenSSL.SSL.Connection object at 0x109206d00>
buf = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'
flags = 0

    def send(self, buf, flags=0):
        """
        Send data on the connection. NOTE: If you get one of the WantRead,
        WantWrite or WantX509Lookup exceptions on this, you have to call the
        method again with the SAME buffer.
    
        :param buf: The string, buffer or memoryview to send
        :param flags: (optional) Included for compatibility with the socket
                      API, the value is ignored
        :return: The number of bytes written
        """
        # Backward compatibility
        buf = _text_to_bytes_and_warn("buf", buf)
    
        if isinstance(buf, memoryview):
            buf = buf.tobytes()
        if isinstance(buf, _buffer):
            buf = str(buf)
        if not isinstance(buf, bytes):
            raise TypeError("data must be a memoryview, buffer or byte string")
        if len(buf) > 2147483647:
            raise ValueError("Cannot send more than 2**31-1 bytes at once.")
    
        result = _lib.SSL_write(self._ssl, buf, len(buf))
>       self._raise_ssl_error(self._ssl, result)

buf        = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
flags      = 0
result     = -1
self       = <OpenSSL.SSL.Connection object at 0x109206d00>

.tox/python/lib/python3.8/site-packages/OpenSSL/SSL.py:1737: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <OpenSSL.SSL.Connection object at 0x109206d00>
ssl = <cdata 'SSL *' 0x7fb94b982800>, result = -1

    def _raise_ssl_error(self, ssl, result):
        if self._context._verify_helper is not None:
            self._context._verify_helper.raise_if_problem()
        if self._context._npn_advertise_helper is not None:
            self._context._npn_advertise_helper.raise_if_problem()
        if self._context._npn_select_helper is not None:
            self._context._npn_select_helper.raise_if_problem()
        if self._context._alpn_select_helper is not None:
            self._context._alpn_select_helper.raise_if_problem()
        if self._context._ocsp_helper is not None:
            self._context._ocsp_helper.raise_if_problem()
    
        error = _lib.SSL_get_error(ssl, result)
        if error == _lib.SSL_ERROR_WANT_READ:
            raise WantReadError()
        elif error == _lib.SSL_ERROR_WANT_WRITE:
            raise WantWriteError()
        elif error == _lib.SSL_ERROR_ZERO_RETURN:
            raise ZeroReturnError()
        elif error == _lib.SSL_ERROR_WANT_X509_LOOKUP:
            # TODO: This is untested.
            raise WantX509LookupError()
        elif error == _lib.SSL_ERROR_SYSCALL:
            if _lib.ERR_peek_error() == 0:
                if result < 0:
                    if platform == "win32":
                        errno = _ffi.getwinerror()[0]
                    else:
                        errno = _ffi.errno
    
                    if errno != 0:
>                       raise SysCallError(errno, errorcode.get(errno))
E                       OpenSSL.SSL.SysCallError: (32, 'EPIPE')

errno      = 32
error      = 5
result     = -1
self       = <OpenSSL.SSL.Connection object at 0x109206d00>
ssl        = <cdata 'SSL *' 0x7fb94b982800>

.tox/python/lib/python3.8/site-packages/OpenSSL/SSL.py:1639: SysCallError

During handling of the above exception, another exception occurred:

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = <urllib3.util.timeout.Timeout object at 0x10885c0a0>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <urllib3.util.timeout.Timeout object at 0x1089e7880>
is_new_proxy_conn = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.
    
        .. note::
    
           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.
    
        .. note::
    
           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param body:
            Data to send in the request body (useful for creating
            POST requests, see HTTPConnectionPool.post_url for
            more convenience).
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.
    
        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When False, you can
            use the pool on an HTTP proxy and request foreign hosts.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.
    
        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.
    
        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parse_url(url).url)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == "http":
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
>           httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )

assert_same_host = False
body       = None
body_pos   = None
chunked    = False
clean_exit = False
conn       = None
err        = None
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
is_new_proxy_conn = False
method     = 'GET'
pool_timeout = None
redirect   = False
release_conn = False
release_this_conn = True
response_kw = {'decode_content': False, 'preload_content': False}
retries    = Retry(total=0, connect=None, read=False, redirect=None, status=None)
self       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
timeout    = <urllib3.util.timeout.Timeout object at 0x10885c0a0>
timeout_obj = <urllib3.util.timeout.Timeout object at 0x1089e7880>
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/connectionpool.py:665: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
conn = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
method = 'GET', url = '/'
timeout = <urllib3.util.timeout.Timeout object at 0x1089e7880>, chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = <urllib3.util.timeout.Timeout object at 0x109206040>

    def _make_request(
        self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
    ):
        """
        Perform a request on a given urllib connection object taken from our
        pool.
    
        :param conn:
            a connection from one of our connection pools
    
        :param timeout:
            Socket timeout in seconds for the request. This can be a
            float or integer, which will set the same timeout value for
            the socket connect and the socket read, or an instance of
            :class:`urllib3.util.Timeout`, which gives you more fine-grained
            control over your timeouts.
        """
        self.num_requests += 1
    
        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout
    
        # Trigger any extra validation we need to do.
        try:
            self._validate_conn(conn)
        except (SocketTimeout, BaseSSLError) as e:
            # Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
            self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
            raise
    
        # conn.request() calls httplib.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        if chunked:
            conn.request_chunked(method, url, **httplib_request_kw)
        else:
>           conn.request(method, url, **httplib_request_kw)

chunked    = False
conn       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
httplib_request_kw = {'body': None,
 'headers': {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
method     = 'GET'
self       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
timeout    = <urllib3.util.timeout.Timeout object at 0x1089e7880>
timeout_obj = <urllib3.util.timeout.Timeout object at 0x109206040>
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/connectionpool.py:387: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}

    def request(self, method, url, body=None, headers={}, *,
                encode_chunked=False):
        """Send a complete request to the server."""
>       self._send_request(method, url, body, headers, encode_chunked)

body       = None
encode_chunked = False
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
method     = 'GET'
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
url        = '/'

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1230: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
encode_chunked = False

    def _send_request(self, method, url, body, headers, encode_chunked):
        # Honor explicitly requested Host: and Accept-Encoding: headers.
        header_names = frozenset(k.lower() for k in headers)
        skips = {}
        if 'host' in header_names:
            skips['skip_host'] = 1
        if 'accept-encoding' in header_names:
            skips['skip_accept_encoding'] = 1
    
        self.putrequest(method, url, **skips)
    
        # chunked encoding will happen if HTTP/1.1 is used and either
        # the caller passes encode_chunked=True or the following
        # conditions hold:
        # 1. content-length has not been explicitly set
        # 2. the body is a file or iterable, but not a str or bytes-like
        # 3. Transfer-Encoding has NOT been explicitly set by the caller
    
        if 'content-length' not in header_names:
            # only chunk body if not explicitly set for backwards
            # compatibility, assuming the client code is already handling the
            # chunking
            if 'transfer-encoding' not in header_names:
                # if content-length cannot be automatically determined, fall
                # back to chunked encoding
                encode_chunked = False
                content_length = self._get_content_length(body, method)
                if content_length is None:
                    if body is not None:
                        if self.debuglevel > 0:
                            print('Unable to determine size of %r' % body)
                        encode_chunked = True
                        self.putheader('Transfer-Encoding', 'chunked')
                else:
                    self.putheader('Content-Length', str(content_length))
        else:
            encode_chunked = False
    
        for hdr, value in headers.items():
            self.putheader(hdr, value)
        if isinstance(body, str):
            # RFC 2616 Section 3.7.1 says that text default has a
            # default charset of iso-8859-1.
            body = _encode(body, 'body')
>       self.endheaders(body, encode_chunked=encode_chunked)

body       = None
content_length = None
encode_chunked = False
hdr        = 'Connection'
header_names = frozenset({'connection', 'user-agent', 'accept', 'accept-encoding'})
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
method     = 'GET'
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
skips      = {'skip_accept_encoding': 1}
url        = '/'
value      = 'keep-alive'

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1276: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
message_body = None

    def endheaders(self, message_body=None, *, encode_chunked=False):
        """Indicate that the last header line has been sent to the server.
    
        This method sends the request to the server.  The optional message_body
        argument can be used to pass a message body associated with the
        request.
        """
        if self.__state == _CS_REQ_STARTED:
            self.__state = _CS_REQ_SENT
        else:
            raise CannotSendHeader()
>       self._send_output(message_body, encode_chunked=encode_chunked)

encode_chunked = False
message_body = None
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1225: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
message_body = None, encode_chunked = False

    def _send_output(self, message_body=None, encode_chunked=False):
        """Send the currently buffered request and clear the buffer.
    
        Appends an extra \\r\\n to the buffer.
        A message_body may be specified, to be appended to the request.
        """
        self._buffer.extend((b"", b""))
        msg = b"\r\n".join(self._buffer)
        del self._buffer[:]
>       self.send(msg)

encode_chunked = False
message_body = None
msg        = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1004: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def send(self, data):
        """Send `data' to the server.
        ``data`` can be a string object, a bytes object, an array object, a
        file-like object that supports a .read() method, or an iterable object.
        """
    
        if self.sock is None:
            if self.auto_open:
                self.connect()
            else:
                raise NotConnected()
    
        if self.debuglevel > 0:
            print("send:", repr(data))
        if hasattr(data, "read") :
            if self.debuglevel > 0:
                print("sendIng a read()able")
            encode = self._is_textIO(data)
            if encode and self.debuglevel > 0:
                print("encoding file using iso-8859-1")
            while 1:
                datablock = data.read(self.blocksize)
                if not datablock:
                    break
                if encode:
                    datablock = datablock.encode("iso-8859-1")
                self.sock.sendall(datablock)
            return
        try:
>           self.sock.sendall(data)

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:965: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x109206940>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def sendall(self, data):
        total_sent = 0
        while total_sent < len(data):
>           sent = self._send_until_done(
                data[total_sent : total_sent + SSL_WRITE_BLOCKSIZE]
            )

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x109206940>
total_sent = 0

.tox/python/lib/python3.8/site-packages/urllib3/contrib/pyopenssl.py:351: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x109206940>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def _send_until_done(self, data):
        while True:
            try:
                return self.connection.send(data)
            except OpenSSL.SSL.WantWriteError:
                if not util.wait_for_write(self.socket, self.socket.gettimeout()):
                    raise timeout()
                continue
            except OpenSSL.SSL.SysCallError as e:
>               raise SocketError(str(e))
E               OSError: (32, 'EPIPE')

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x109206940>

.tox/python/lib/python3.8/site-packages/urllib3/contrib/pyopenssl.py:346: OSError

During handling of the above exception, another exception occurred:

self = <requests.adapters.HTTPAdapter object at 0x109280b20>
request = <PreparedRequest [GET]>, stream = False
timeout = <urllib3.util.timeout.Timeout object at 0x10885c0a0>
verify = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpcxbi29j0.pem'
cert = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmp4iji7q6c.pem'
proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """
    
        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
>               resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )

cert       = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmp4iji7q6c.pem'
chunked    = False
conn       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
proxies    = OrderedDict()
request    = <PreparedRequest [GET]>
self       = <requests.adapters.HTTPAdapter object at 0x109280b20>
stream     = False
timeout    = <urllib3.util.timeout.Timeout object at 0x10885c0a0>
url        = '/'
verify     = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpcxbi29j0.pem'

.tox/python/lib/python3.8/site-packages/requests/adapters.py:439: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = <urllib3.util.timeout.Timeout object at 0x10885c0a0>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <urllib3.util.timeout.Timeout object at 0x1089e7880>
is_new_proxy_conn = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.
    
        .. note::
    
           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.
    
        .. note::
    
           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param body:
            Data to send in the request body (useful for creating
            POST requests, see HTTPConnectionPool.post_url for
            more convenience).
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.
    
        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When False, you can
            use the pool on an HTTP proxy and request foreign hosts.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.
    
        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.
    
        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parse_url(url).url)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == "http":
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )
    
            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None
    
            # Pass method to Response for length checking
            response_kw["request_method"] = method
    
            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(
                httplib_response,
                pool=self,
                connection=response_conn,
                retries=retries,
                **response_kw
            )
    
            # Everything went great!
            clean_exit = True
    
        except queue.Empty:
            # Timed out by queue.
            raise EmptyPoolError(self, "No pool connections are available.")
    
        except (
            TimeoutError,
            HTTPException,
            SocketError,
            ProtocolError,
            BaseSSLError,
            SSLError,
            CertificateError,
        ) as e:
            # Discard the connection for these exceptions. It will be
            # replaced during the next _get_conn() call.
            clean_exit = False
            if isinstance(e, (BaseSSLError, CertificateError)):
                e = SSLError(e)
            elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError("Cannot connect to proxy.", e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError("Connection aborted.", e)
    
>           retries = retries.increment(
                method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
            )

assert_same_host = False
body       = None
body_pos   = None
chunked    = False
clean_exit = False
conn       = None
err        = None
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
is_new_proxy_conn = False
method     = 'GET'
pool_timeout = None
redirect   = False
release_conn = False
release_this_conn = True
response_kw = {'decode_content': False, 'preload_content': False}
retries    = Retry(total=0, connect=None, read=False, redirect=None, status=None)
self       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
timeout    = <urllib3.util.timeout.Timeout object at 0x10885c0a0>
timeout_obj = <urllib3.util.timeout.Timeout object at 0x1089e7880>
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/connectionpool.py:719: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
method = 'GET', url = '/', response = None
error = ProtocolError('Connection aborted.', OSError("(32, 'EPIPE')"))
_pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
_stacktrace = <traceback object at 0x109214b80>

    def increment(
        self,
        method=None,
        url=None,
        response=None,
        error=None,
        _pool=None,
        _stacktrace=None,
    ):
        """ Return a new Retry object with incremented retry counters.
    
        :param response: A response object, or None, if the server did not
            return a response.
        :type response: :class:`~urllib3.response.HTTPResponse`
        :param Exception error: An error encountered during the request, or
            None if the response was received successfully.
    
        :return: A new ``Retry`` object.
        """
        if self.total is False and error:
            # Disabled, indicate to re-raise the error.
            raise six.reraise(type(error), error, _stacktrace)
    
        total = self.total
        if total is not None:
            total -= 1
    
        connect = self.connect
        read = self.read
        redirect = self.redirect
        status_count = self.status
        cause = "unknown"
        status = None
        redirect_location = None
    
        if error and self._is_connection_error(error):
            # Connect retry?
            if connect is False:
                raise six.reraise(type(error), error, _stacktrace)
            elif connect is not None:
                connect -= 1
    
        elif error and self._is_read_error(error):
            # Read retry?
            if read is False or not self._is_method_retryable(method):
>               raise six.reraise(type(error), error, _stacktrace)

_pool      = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
_stacktrace = <traceback object at 0x109214b80>
cause      = 'unknown'
connect    = None
error      = ProtocolError('Connection aborted.', OSError("(32, 'EPIPE')"))
method     = 'GET'
read       = False
redirect   = None
redirect_location = None
response   = None
self       = Retry(total=0, connect=None, read=False, redirect=None, status=None)
status     = None
status_count = None
total      = -1
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/util/retry.py:400: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

tp = <class 'urllib3.exceptions.ProtocolError'>, value = None, tb = None

    def reraise(tp, value, tb=None):
        try:
            if value is None:
                value = tp()
            if value.__traceback__ is not tb:
>               raise value.with_traceback(tb)

tb         = None
tp         = <class 'urllib3.exceptions.ProtocolError'>
value      = None

.tox/python/lib/python3.8/site-packages/urllib3/packages/six.py:734: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = <urllib3.util.timeout.Timeout object at 0x10885c0a0>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <urllib3.util.timeout.Timeout object at 0x1089e7880>
is_new_proxy_conn = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.
    
        .. note::
    
           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.
    
        .. note::
    
           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.
    
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
    
        :param body:
            Data to send in the request body (useful for creating
            POST requests, see HTTPConnectionPool.post_url for
            more convenience).
    
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
    
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
    
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
    
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
    
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.
    
        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When False, you can
            use the pool on an HTTP proxy and request foreign hosts.
    
        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
    
        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.
    
        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.
    
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
    
        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.
    
        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parse_url(url).url)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == "http":
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
>           httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )

assert_same_host = False
body       = None
body_pos   = None
chunked    = False
clean_exit = False
conn       = None
err        = None
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
is_new_proxy_conn = False
method     = 'GET'
pool_timeout = None
redirect   = False
release_conn = False
release_this_conn = True
response_kw = {'decode_content': False, 'preload_content': False}
retries    = Retry(total=0, connect=None, read=False, redirect=None, status=None)
self       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
timeout    = <urllib3.util.timeout.Timeout object at 0x10885c0a0>
timeout_obj = <urllib3.util.timeout.Timeout object at 0x1089e7880>
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/connectionpool.py:665: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
conn = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
method = 'GET', url = '/'
timeout = <urllib3.util.timeout.Timeout object at 0x1089e7880>, chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = <urllib3.util.timeout.Timeout object at 0x109206040>

    def _make_request(
        self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
    ):
        """
        Perform a request on a given urllib connection object taken from our
        pool.
    
        :param conn:
            a connection from one of our connection pools
    
        :param timeout:
            Socket timeout in seconds for the request. This can be a
            float or integer, which will set the same timeout value for
            the socket connect and the socket read, or an instance of
            :class:`urllib3.util.Timeout`, which gives you more fine-grained
            control over your timeouts.
        """
        self.num_requests += 1
    
        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout
    
        # Trigger any extra validation we need to do.
        try:
            self._validate_conn(conn)
        except (SocketTimeout, BaseSSLError) as e:
            # Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
            self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
            raise
    
        # conn.request() calls httplib.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        if chunked:
            conn.request_chunked(method, url, **httplib_request_kw)
        else:
>           conn.request(method, url, **httplib_request_kw)

chunked    = False
conn       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
httplib_request_kw = {'body': None,
 'headers': {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
method     = 'GET'
self       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
timeout    = <urllib3.util.timeout.Timeout object at 0x1089e7880>
timeout_obj = <urllib3.util.timeout.Timeout object at 0x109206040>
url        = '/'

.tox/python/lib/python3.8/site-packages/urllib3/connectionpool.py:387: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}

    def request(self, method, url, body=None, headers={}, *,
                encode_chunked=False):
        """Send a complete request to the server."""
>       self._send_request(method, url, body, headers, encode_chunked)

body       = None
encode_chunked = False
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
method     = 'GET'
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
url        = '/'

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1230: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
encode_chunked = False

    def _send_request(self, method, url, body, headers, encode_chunked):
        # Honor explicitly requested Host: and Accept-Encoding: headers.
        header_names = frozenset(k.lower() for k in headers)
        skips = {}
        if 'host' in header_names:
            skips['skip_host'] = 1
        if 'accept-encoding' in header_names:
            skips['skip_accept_encoding'] = 1
    
        self.putrequest(method, url, **skips)
    
        # chunked encoding will happen if HTTP/1.1 is used and either
        # the caller passes encode_chunked=True or the following
        # conditions hold:
        # 1. content-length has not been explicitly set
        # 2. the body is a file or iterable, but not a str or bytes-like
        # 3. Transfer-Encoding has NOT been explicitly set by the caller
    
        if 'content-length' not in header_names:
            # only chunk body if not explicitly set for backwards
            # compatibility, assuming the client code is already handling the
            # chunking
            if 'transfer-encoding' not in header_names:
                # if content-length cannot be automatically determined, fall
                # back to chunked encoding
                encode_chunked = False
                content_length = self._get_content_length(body, method)
                if content_length is None:
                    if body is not None:
                        if self.debuglevel > 0:
                            print('Unable to determine size of %r' % body)
                        encode_chunked = True
                        self.putheader('Transfer-Encoding', 'chunked')
                else:
                    self.putheader('Content-Length', str(content_length))
        else:
            encode_chunked = False
    
        for hdr, value in headers.items():
            self.putheader(hdr, value)
        if isinstance(body, str):
            # RFC 2616 Section 3.7.1 says that text default has a
            # default charset of iso-8859-1.
            body = _encode(body, 'body')
>       self.endheaders(body, encode_chunked=encode_chunked)

body       = None
content_length = None
encode_chunked = False
hdr        = 'Connection'
header_names = frozenset({'connection', 'user-agent', 'accept', 'accept-encoding'})
headers    = {'User-Agent': 'python-requests/2.22.0', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
method     = 'GET'
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
skips      = {'skip_accept_encoding': 1}
url        = '/'
value      = 'keep-alive'

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1276: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
message_body = None

    def endheaders(self, message_body=None, *, encode_chunked=False):
        """Indicate that the last header line has been sent to the server.
    
        This method sends the request to the server.  The optional message_body
        argument can be used to pass a message body associated with the
        request.
        """
        if self.__state == _CS_REQ_STARTED:
            self.__state = _CS_REQ_SENT
        else:
            raise CannotSendHeader()
>       self._send_output(message_body, encode_chunked=encode_chunked)

encode_chunked = False
message_body = None
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1225: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
message_body = None, encode_chunked = False

    def _send_output(self, message_body=None, encode_chunked=False):
        """Send the currently buffered request and clear the buffer.
    
        Appends an extra \\r\\n to the buffer.
        A message_body may be specified, to be appended to the request.
        """
        self._buffer.extend((b"", b""))
        msg = b"\r\n".join(self._buffer)
        del self._buffer[:]
>       self.send(msg)

encode_chunked = False
message_body = None
msg        = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:1004: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def send(self, data):
        """Send `data' to the server.
        ``data`` can be a string object, a bytes object, an array object, a
        file-like object that supports a .read() method, or an iterable object.
        """
    
        if self.sock is None:
            if self.auto_open:
                self.connect()
            else:
                raise NotConnected()
    
        if self.debuglevel > 0:
            print("send:", repr(data))
        if hasattr(data, "read") :
            if self.debuglevel > 0:
                print("sendIng a read()able")
            encode = self._is_textIO(data)
            if encode and self.debuglevel > 0:
                print("encoding file using iso-8859-1")
            while 1:
                datablock = data.read(self.blocksize)
                if not datablock:
                    break
                if encode:
                    datablock = datablock.encode("iso-8859-1")
                self.sock.sendall(datablock)
            return
        try:
>           self.sock.sendall(data)

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.connection.VerifiedHTTPSConnection object at 0x109206100>

/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py:965: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x109206940>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def sendall(self, data):
        total_sent = 0
        while total_sent < len(data):
>           sent = self._send_until_done(
                data[total_sent : total_sent + SSL_WRITE_BLOCKSIZE]
            )

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x109206940>
total_sent = 0

.tox/python/lib/python3.8/site-packages/urllib3/contrib/pyopenssl.py:351: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x109206940>
data = b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def _send_until_done(self, data):
        while True:
            try:
                return self.connection.send(data)
            except OpenSSL.SSL.WantWriteError:
                if not util.wait_for_write(self.socket, self.socket.gettimeout()):
                    raise timeout()
                continue
            except OpenSSL.SSL.SysCallError as e:
>               raise SocketError(str(e))
E               urllib3.exceptions.ProtocolError: ('Connection aborted.', OSError("(32, 'EPIPE')"))

data       = (b'GET / HTTP/1.1\r\nHost: 127.0.0.1:63861\r\nUser-Agent: python-requests/2.22.'
 b'0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-ali'
 b've\r\n\r\n')
self       = <urllib3.contrib.pyopenssl.WrappedSocket object at 0x109206940>

.tox/python/lib/python3.8/site-packages/urllib3/contrib/pyopenssl.py:346: ProtocolError

During handling of the above exception, another exception occurred:

mocker = <pytest_mock.MockFixture object at 0x10885c190>
tls_http_server = <generator object tls_http_server.<locals>.start_srv at 0x10885d350>
adapter_type = 'builtin', ca = <trustme.CA object at 0x10885c280>
tls_certificate = <trustme.LeafCert object at 0x109247220>
tls_certificate_chain_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpql60xhsf.pem'
tls_certificate_private_key_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpv75zyc5g.pem'
tls_ca_certificate_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpcxbi29j0.pem'
is_trusted_cert = False, tls_client_identity = 'localhost'
tls_verify_mode = <VerifyMode.CERT_REQUIRED: 2>

    @pytest.mark.parametrize(
        'adapter_type',
        (
            'builtin',
            'pyopenssl',
        ),
    )
    @pytest.mark.parametrize(
        'is_trusted_cert,tls_client_identity',
        (
            (True, 'localhost'), (True, '127.0.0.1'),
            (True, '*.localhost'), (True, 'not_localhost'),
            (False, 'localhost'),
        ),
    )
    @pytest.mark.parametrize(
        'tls_verify_mode',
        (
            ssl.CERT_NONE,  # server shouldn't validate client cert
            ssl.CERT_OPTIONAL,  # same as CERT_REQUIRED in client mode, don't use
            ssl.CERT_REQUIRED,  # server should validate if client cert CA is OK
        ),
    )
    def test_tls_client_auth(
        # FIXME: remove twisted logic, separate tests
        mocker,
        tls_http_server, adapter_type,
        ca,
        tls_certificate,
        tls_certificate_chain_pem_path,
        tls_certificate_private_key_pem_path,
        tls_ca_certificate_pem_path,
        is_trusted_cert, tls_client_identity,
        tls_verify_mode,
    ):
        """Verify that client TLS certificate auth works correctly."""
        test_cert_rejection = (
            tls_verify_mode != ssl.CERT_NONE
            and not is_trusted_cert
        )
        interface, _host, port = _get_conn_data(ANY_INTERFACE_IPV4)
    
        client_cert_root_ca = ca if is_trusted_cert else trustme.CA()
        with mocker.mock_module.patch(
            'idna.core.ulabel',
            return_value=ntob(tls_client_identity),
        ):
            client_cert = client_cert_root_ca.issue_server_cert(
                # FIXME: change to issue_cert once new trustme is out
                ntou(tls_client_identity),
            )
            del client_cert_root_ca
    
        with client_cert.private_key_and_cert_chain_pem.tempfile() as cl_pem:
            tls_adapter_cls = get_ssl_adapter_class(name=adapter_type)
            tls_adapter = tls_adapter_cls(
                tls_certificate_chain_pem_path,
                tls_certificate_private_key_pem_path,
            )
            if adapter_type == 'pyopenssl':
                tls_adapter.context = tls_adapter.get_context()
                tls_adapter.context.set_verify(
                    _stdlib_to_openssl_verify[tls_verify_mode],
                    lambda conn, cert, errno, depth, preverify_ok: preverify_ok,
                )
            else:
                tls_adapter.context.verify_mode = tls_verify_mode
    
            ca.configure_trust(tls_adapter.context)
            tls_certificate.configure_cert(tls_adapter.context)
    
            tlshttpserver = tls_http_server.send(
                (
                    (interface, port),
                    tls_adapter,
                ),
            )
    
            interface, _host, port = _get_conn_data(tlshttpserver.bind_addr)
    
            make_https_request = functools.partial(
                requests.get,
                'https://' + interface + ':' + str(port) + '/',
    
                # Server TLS certificate verification:
                verify=tls_ca_certificate_pem_path,
    
                # Client TLS certificate verification:
                cert=cl_pem,
            )
    
            if not test_cert_rejection:
                resp = make_https_request()
                is_req_successful = resp.status_code == 200
                if (
                        not is_req_successful
                        and IS_PYOPENSSL_SSL_VERSION_1_0
                        and adapter_type == 'builtin'
                        and tls_verify_mode == ssl.CERT_REQUIRED
                        and tls_client_identity == 'localhost'
                        and is_trusted_cert
                ) or PY34:
                    pytest.xfail(
                        'OpenSSL 1.0 has problems with verifying client certs',
                    )
                assert is_req_successful
                assert resp.text == 'Hello world!'
                return
    
            expected_ssl_errors = (
                requests.exceptions.SSLError,
                OpenSSL.SSL.Error,
            ) if PY34 else (
                requests.exceptions.SSLError,
            )
            if IS_WINDOWS or IS_GITHUB_ACTIONS_WORKFLOW:
                expected_ssl_errors += requests.exceptions.ConnectionError,
            with pytest.raises(expected_ssl_errors) as ssl_err:
>               make_https_request()

_host      = '127.0.0.1'
adapter_type = 'builtin'
ca         = <trustme.CA object at 0x10885c280>
cl_pem     = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmp4iji7q6c.pem'
client_cert = <trustme.LeafCert object at 0x1092107c0>
expected_ssl_errors = (<class 'requests.exceptions.SSLError'>,)
interface  = '127.0.0.1'
is_trusted_cert = False
make_https_request = functools.partial(<function get at 0x1087505e0>, 'https://127.0.0.1:63861/', verify='/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpcxbi29j0.pem', cert='/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmp4iji7q6c.pem')
mocker     = <pytest_mock.MockFixture object at 0x10885c190>
port       = 63861
ssl_err    = <ExceptionInfo for raises contextmanager>
test_cert_rejection = True
tls_adapter = <cheroot.ssl.builtin.BuiltinSSLAdapter object at 0x10925b730>
tls_adapter_cls = <class 'cheroot.ssl.builtin.BuiltinSSLAdapter'>
tls_ca_certificate_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpcxbi29j0.pem'
tls_certificate = <trustme.LeafCert object at 0x109247220>
tls_certificate_chain_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpql60xhsf.pem'
tls_certificate_private_key_pem_path = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpv75zyc5g.pem'
tls_client_identity = 'localhost'
tls_http_server = <generator object tls_http_server.<locals>.start_srv at 0x10885d350>
tls_verify_mode = <VerifyMode.CERT_REQUIRED: 2>
tlshttpserver = <cheroot.server.HTTPServer object at 0x10925be20>

cheroot/test/test_ssl.py:327: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/python/lib/python3.8/site-packages/requests/api.py:75: in get
    return request('get', url, params=params, **kwargs)
.tox/python/lib/python3.8/site-packages/requests/api.py:60: in request
    return session.request(method=method, url=url, **kwargs)
.tox/python/lib/python3.8/site-packages/requests/sessions.py:533: in request
    resp = self.send(prep, **send_kwargs)
.tox/python/lib/python3.8/site-packages/requests/sessions.py:646: in send
    r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.adapters.HTTPAdapter object at 0x109280b20>
request = <PreparedRequest [GET]>, stream = False
timeout = <urllib3.util.timeout.Timeout object at 0x10885c0a0>
verify = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpcxbi29j0.pem'
cert = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmp4iji7q6c.pem'
proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """
    
        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )
    
            # Send the request.
            else:
                if hasattr(conn, 'proxy_pool'):
                    conn = conn.proxy_pool
    
                low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
    
                try:
                    low_conn.putrequest(request.method,
                                        url,
                                        skip_accept_encoding=True)
    
                    for header, value in request.headers.items():
                        low_conn.putheader(header, value)
    
                    low_conn.endheaders()
    
                    for i in request.body:
                        low_conn.send(hex(len(i))[2:].encode('utf-8'))
                        low_conn.send(b'\r\n')
                        low_conn.send(i)
                        low_conn.send(b'\r\n')
                    low_conn.send(b'0\r\n\r\n')
    
                    # Receive the response from the server
                    try:
                        # For Python 2.7, use buffering of HTTP responses
                        r = low_conn.getresponse(buffering=True)
                    except TypeError:
                        # For compatibility with Python 3.3+
                        r = low_conn.getresponse()
    
                    resp = HTTPResponse.from_httplib(
                        r,
                        pool=conn,
                        connection=low_conn,
                        preload_content=False,
                        decode_content=False
                    )
                except:
                    # If we hit any problems here, clean up the connection.
                    # Then, reraise so that we can handle the actual exception.
                    low_conn.close()
                    raise
    
        except (ProtocolError, socket.error) as err:
>           raise ConnectionError(err, request=request)
E           requests.exceptions.ConnectionError: ('Connection aborted.', OSError("(32, 'EPIPE')"))

cert       = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmp4iji7q6c.pem'
chunked    = False
conn       = <urllib3.connectionpool.HTTPSConnectionPool object at 0x1089e7af0>
proxies    = OrderedDict()
request    = <PreparedRequest [GET]>
self       = <requests.adapters.HTTPAdapter object at 0x109280b20>
stream     = False
timeout    = <urllib3.util.timeout.Timeout object at 0x10885c0a0>
url        = '/'
verify     = '/var/folders/qs/5jptvz2x7_gblx4kc3qj005800n8zm/T/tmpcxbi29j0.pem'

.tox/python/lib/python3.8/site-packages/requests/adapters.py:498: ConnectionError
- generated xml file: /Users/jaraco/code/public/cherrypy/cheroot/junit-test-results.xml -

-------- coverage: platform darwin, python 3.8.0-candidate-1 ---------
Name                            Stmts   Miss  Cover   Missing
-------------------------------------------------------------
cheroot/__init__.py                10      4    60%   8-9, 14-15
cheroot/__main__.py                 3      3     0%   3-6
cheroot/_compat.py                 49     13    73%   15-16, 49-77, 99
cheroot/cli.py                     71     71     0%   24-234
cheroot/connections.py            143     31    78%   19-41, 92, 153-160, 176, 189, 218, 237-242, 253, 256, 264, 268, 273
cheroot/makefile.py               302    240    21%   11-13, 32, 47, 56-59, 62, 65-68, 72-82, 86-88, 92-95, 100-110, 116, 132-190, 194-282, 286-406, 447
cheroot/server.py                 938    280    70%   80-81, 122-129, 134-137, 197, 203, 213, 224, 229, 246-247, 266, 278-281, 294-297, 322-331, 335, 339, 343-346, 376, 396-405, 418-427, 431, 435, 439-441, 471, 477, 486-487, 496, 504, 522, 526, 535-537, 552-583, 596-605, 614-635, 639, 722-727, 734-740, 774-776, 824, 846-872, 917-921, 938-940, 947, 970-971, 1032-1034, 1061-1063, 1074-1080, 1112, 1122-1124, 1151, 1272, 1295-1299, 1301, 1317, 1343-1346, 1376-1408, 1413-1414, 1419-1420, 1425-1426, 1436-1450, 1455-1456, 1461-1462, 1476, 1632-1635, 1639, 1676, 1687-1696, 1714, 1719-1721, 1732-1740, 1747-1751, 1754, 1771-1774, 1780-1784, 1832, 1841-1865, 1877, 1883-1885, 1894-1895, 1897-1898, 1901, 1953-1956, 1984, 1991, 1997-1999, 2011-2013, 2031-2036, 2056-2058, 2102, 2110-2111
cheroot/ssl/builtin.py             90     19    79%   15-16, 20-24, 34-36, 44, 91, 103, 121, 142-147, 163, 197
cheroot/ssl/pyopenssl.py          140     20    86%   49-52, 83, 88-91, 94, 99-100, 109, 113, 133, 141, 250, 261, 276, 343
cheroot/test/conftest.py           38      1    97%   31
cheroot/test/helper.py            102     42    59%   48-76, 81-82, 87-89, 94-97, 103-111, 139, 154-155, 166
cheroot/test/test_conn.py         517     22    96%   46, 85-87, 244-249, 542-546, 570-573, 576, 671-676, 853, 956-958
cheroot/test/test_core.py         205      6    97%   37, 56, 363-366, 399
cheroot/test/test_makefile.py      31      4    87%   27-30
cheroot/test/test_server.py       126     50    60%   47, 64, 139-141, 150-160, 163-166, 172-175, 182-201, 213-235
cheroot/test/test_ssl.py          184     14    92%   86, 311, 325, 330, 337-343, 351, 361, 464, 472
cheroot/test/webtest.py           322    188    42%   56-59, 64-66, 74-81, 96-98, 127-128, 132-136, 147-152, 161, 165, 173, 198-224, 234-242, 247-292, 297, 301-306, 314-319, 323-334, 338-347, 351-358, 362-367, 371-377, 381-388, 392-397, 401-406, 410-415, 435, 447-450, 462-477, 495, 519-522, 526, 535, 541, 592-601
cheroot/testing.py                 78     11    86%   48-49, 74-75, 124-128, 132, 144-146
cheroot/workers/threadpool.py     142     44    69%   22, 25, 119, 129-133, 135-136, 180, 185-186, 199-204, 208-219, 222-225, 231-247, 269, 283, 292, 297
cheroot/wsgi.py                   162     46    72%   92, 96, 147, 160, 170-173, 179, 183, 203, 205, 215, 282-292, 316, 332-349, 353-356, 360-364, 418-424
-------------------------------------------------------------
TOTAL                            3741   1109    70%

7 files skipped due to complete coverage.
Coverage XML written to file coverage.xml

========================== slowest 10 test durations ===========================
4.12s call     cheroot/test/test_conn.py::test_HTTP11_Timeout_after_request
3.43s call     cheroot/test/test_conn.py::test_keepalive_conn_management
2.01s call     cheroot/test/test_conn.py::test_HTTP11_Timeout[True]
2.01s call     cheroot/test/test_conn.py::test_HTTP11_Timeout[False]
0.75s call     cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-127.0.0.1-pyopenssl]
0.70s setup    cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-False-localhost-pyopenssl]
0.70s setup    cheroot/test/test_ssl.py::test_ssl_adapters[builtin]
0.62s setup    cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_OPTIONAL-True-*.localhost-builtin]
0.62s call     cheroot/test/test_ssl.py::test_tls_client_auth[VerifyMode.CERT_REQUIRED-True-not_localhost-pyopenssl]
0.61s call     cheroot/test/test_conn.py::test_598
=========================== short test summary info ============================
XFAIL cheroot/test/test_conn.py::test_Chunked_Encoding
  Server does not correctly read trailers/ending of the previous HTTP request, thus the second request fails as the server tries to parse b'Content-Type: application/json\r\n' as a Request-Line. This results in HTTP status code 400, instead of 413Ref: https://github.com/cherrypy/cheroot/issues/69
XFAIL cheroot/test/test_conn.py::test_Chunked_Encoding
  Server does not correctly read trailers/ending of the previous HTTP request, thus the second request fails as the server tries to parse b'Content-Type: application/json\r\n' as a Request-Line. This results in HTTP status code 400, instead of 413Ref: https://github.com/cherrypy/cheroot/issues/69
XFAIL cheroot/test/test_conn.py::test_598
  Sometimes this test fails due to low timeout. Ref: https://github.com/cherrypy/cherrypy/issues/598
SKIPPED [2] /Users/jaraco/code/public/cherrypy/cheroot/cheroot/test/test_server.py:61: Darwin does not support an abstract socket namespace
SKIPPED [2] cheroot/test/test_server.py:178: Peercreds lookup does not work under macOS/BSD currently.
SKIPPED [2] cheroot/test/test_server.py:204: Peercreds lookup does not work under macOS/BSD currently.
============= 2 failed, 102 passed, 6 skipped, 3 xfailed in 14.11s =============
Coverage.py warning: No data was collected. (no-data-collected)
ERROR: InvocationError for command /Users/jaraco/code/public/cherrypy/cheroot/.tox/python/bin/pytest --testmon-off (exited with code 1)
___________________________________ summary ____________________________________
ERROR:   python: commands failed

jaraco avatar Oct 11 '19 20:10 jaraco

I've disabled those tests so that the test suite runs cleanly on my workstation. To replicate the failures, run: tox -- -k test_tls_client_auth --runxfail. If you experience the failures, we would be delighted if you could help solve the root problem.
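In case it helps anyone digging into this, here is a minimal sketch of the client-side call those failing parametrizations make. The host, port, and PEM paths are placeholders for illustration, not values from the run above.

import requests

# Hypothetical values -- substitute the address the test server binds to
# and the PEM files trustme generates for your run.
HOST, PORT = '127.0.0.1', 8443
CA_BUNDLE = 'ca.pem'                 # CA that signed the *server* certificate
CLIENT_PEM = 'untrusted-client.pem'  # client certificate the server should reject

try:
    requests.get(
        'https://{host}:{port}/'.format(host=HOST, port=PORT),
        verify=CA_BUNDLE,   # server TLS certificate verification
        cert=CLIENT_PEM,    # client TLS certificate presented to the server
    )
except requests.exceptions.SSLError:
    print('client certificate rejected cleanly, as the test expects')
except requests.exceptions.ConnectionError as err:
    # This is the branch the macOS run above lands in:
    # ('Connection aborted.', OSError("(32, 'EPIPE')"))
    print('connection aborted instead of a clean TLS error:', err)

The test only tolerates ConnectionError on Windows and under GitHub Actions workflows, which is why the EPIPE abort above is reported as a failure rather than as a rejected client certificate.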

jaraco avatar Oct 11 '19 20:10 jaraco

To replicate the failures, run: tox -- -k test_tls_client_auth --runxfail.

This doesn't fail for me.

webknjaz avatar Oct 17 '19 13:10 webknjaz

Ah, I missed that it's a macOS failure. I think this depends on the OpenSSL backend in your env.
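If it is the backend, a quick check along these lines should show which OpenSSL each stack is actually linked against; whether a version mismatch really explains the macOS behaviour is only a guess.

import ssl

# Which OpenSSL is the stdlib ssl module linked against?
print('stdlib ssl:', ssl.OPENSSL_VERSION)

try:
    import OpenSSL.SSL
except ImportError:
    print('pyOpenSSL not installed')
else:
    # pyOpenSSL may be linked against a different OpenSSL than the stdlib;
    # SSLeay_version returns bytes, hence the decode().
    print(
        'pyOpenSSL:',
        OpenSSL.SSL.SSLeay_version(OpenSSL.SSL.SSLEAY_VERSION).decode(),
    )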

webknjaz avatar Oct 17 '19 13:10 webknjaz

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale[bot] avatar Dec 16 '19 21:12 stale[bot]

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale[bot] avatar Feb 16 '20 12:02 stale[bot]