
curl_cffi.requests.errors.RequestsError: Failed to perform, ErrCode: 56, Reason: 'Proxy CONNECT aborted'.

Open linux-dxr opened this issue 1 year ago • 3 comments

Has anyone else run into this exception when sending requests through a proxy? Most requests succeed, but every so often this error is raised and a lot of requests get dropped. Is there a solution? Full traceback:

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\1111\.conda\envs\my_scrapy\lib\site-packages\twisted\internet\defer.py", line 1692, in _inlineCallbacks
    result = context.run(
  File "C:\Users\1111\.conda\envs\my_scrapy\lib\site-packages\twisted\python\failure.py", line 518, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "C:\Users\1111\.conda\envs\my_scrapy\lib\site-packages\scrapy\core\downloader\middleware.py", line 42, in process_request
    response = yield deferred_from_coro(
  File "C:\Users\1111\.conda\envs\my_scrapy\lib\site-packages\twisted\internet\defer.py", line 1064, in adapt
    extracted = result.result()
  File "E:\workspace\my_scrapy\my_scrapy\middlewares\request_mode.py", line 70, in _process_request
    raise e
  File "E:\workspace\my_scrapy\my_scrapy\middlewares\request_mode.py", line 55, in _process_request
    response = await s.request(
  File "C:\Users\1111\.conda\envs\my_scrapy\lib\site-packages\curl_cffi\requests\session.py", line 934, in request
    raise RequestsError(str(e), e.code, rsp) from e
curl_cffi.requests.errors.RequestsError: Failed to perform, ErrCode: 56, Reason: 'Proxy CONNECT aborted'. This may be a libcurl error, See https://curl.se/libcurl/c/libcurl-errors.html first for more details.

linux-dxr • Feb 01 '24 07:02

Most likely this is a proxy quality issue.

perkfly • Feb 01 '24 08:02
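Error 56 is libcurl's CURLE_RECV_ERROR, and the "Proxy CONNECT aborted" message typically means the proxy dropped the CONNECT tunnel before it was fully established, which fits the proxy-quality explanation. Since most requests go through, retrying only the failed ones is a common mitigation. Below is a minimal sketch, not taken from this thread: it assumes curl_cffi's AsyncSession, assumes RequestsError exposes the libcurl code as .code (as the traceback above suggests), and uses a hypothetical fetch_with_retry helper with a placeholder proxy URL.

```python
import asyncio

from curl_cffi.requests import AsyncSession
from curl_cffi.requests.errors import RequestsError

PROXIES = {"https": "http://user:pass@proxy.example.com:8080"}  # placeholder proxy

async def fetch_with_retry(session, url, retries=3, backoff=1.0):
    """Retry requests that fail with libcurl error 56 (transient proxy aborts)."""
    for attempt in range(retries):
        try:
            return await session.request("GET", url, proxies=PROXIES)
        except RequestsError as e:
            # Re-raise for any other error code, or once the retries are used up.
            if getattr(e, "code", None) != 56 or attempt == retries - 1:
                raise
            await asyncio.sleep(backoff * (attempt + 1))  # simple linear backoff

async def main():
    async with AsyncSession() as s:
        resp = await fetch_with_retry(s, "https://example.com")
        print(resp.status_code)

asyncio.run(main())
```

Capping the retry count and re-raising anything that is not error 56 keeps genuine failures visible instead of silently swallowing them.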

I'm not using a proxy; my machine connects to the internet directly. But in the same situation I keep getting this crawler warning: [py.warnings] WARNING: C:\Users\test\AppData\Roaming\Python\Python39\site-packages\curl_cffi\aio.py:205: UserWarning: Curlm alread closed! quitting from process_data warnings.warn("Curlm alread closed! quitting from process_data"), plus the occasional crawler exception: curl_cffi.requests.errors.RequestsError: Failed to perform, curl: (56) . See https://curl.se/libcurl/c/libcurl-errors.html first for more details. In other words, it fails to receive network data. Is this also caused by my network quality?

bijiakunkun • Apr 11 '24 07:04

I also get the "UserWarning: Curlm alread closed! quitting from process_data" warning. It only shows up once; later requests don't trigger it. It's not an error, but it's still annoying. I found that the warning goes away if I await asyncio.sleep(0.8) before calling close. Have you found a solution?

wkz2003 • Jun 29 '24 15:06
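For reference, here is a minimal sketch of the sleep-before-close workaround described in the comment above, assuming curl_cffi's AsyncSession used as an async context manager; the 0.8-second delay is simply the value reported to work, not a documented requirement.

```python
import asyncio

from curl_cffi.requests import AsyncSession

async def main():
    async with AsyncSession() as s:
        resp = await s.get("https://example.com")
        print(resp.status_code)
        # Workaround from the comment above: give the event loop a moment to
        # drain curl's pending callbacks before the session is closed, so the
        # "Curlm alread closed! quitting from process_data" warning is not emitted.
        await asyncio.sleep(0.8)  # 0.8 s is the reported value, not a documented constant

asyncio.run(main())
```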