ProxyPool
An Efficient ProxyPool with Getter, Tester and Server
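For context on the questions below, here is a minimal sketch of how a crawler typically consumes the pool over HTTP. The host, the port 5555, and the `/random` route follow the README examples and are assumptions here; adjust them to match your own deployment.

```python
import requests

# Assumed defaults from the README examples; change to match your deployment.
PROXYPOOL_URL = 'http://127.0.0.1:5555/random'
TARGET_URL = 'https://httpbin.org/get'


def get_random_proxy():
    """Fetch one proxy in host:port form from the pool's /random endpoint."""
    return requests.get(PROXYPOOL_URL).text.strip()


def crawl(url):
    """Request a page through a proxy taken from the pool."""
    proxy = get_random_proxy()
    proxies = {'http': f'http://{proxy}', 'https': f'http://{proxy}'}
    return requests.get(url, proxies=proxies, timeout=10).text


if __name__ == '__main__':
    print(crawl(TARGET_URL))
```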
After starting, the following log was printed: * Serving Flask app "proxypool.processors.server" (lazy loading) * Environment: production WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server...
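That warning comes from Flask's built-in development server and is expected when the server is started directly; it is not itself an error. If you want to run the API behind a production WSGI server instead, here is a minimal sketch, assuming the Flask application object in `proxypool.processors.server` is named `app` and that waitress is installed (`pip install waitress`):

```python
# Minimal sketch: serve the ProxyPool Flask app with a production WSGI server
# instead of Flask's built-in development server.
# Assumption: the application object is exported as `app` from
# proxypool.processors.server; adjust the import if your copy differs.
from waitress import serve

from proxypool.processors.server import app

if __name__ == '__main__':
    serve(app, host='0.0.0.0', port=5555)  # 5555 matches the README examples
```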
**Describe the bug** Task exception was never retrieved future: Traceback (most recent call last): File "D:\ProgramData\Anaconda3\envs\env_proxy\Lib\site-packages\aiohttp\client_reqrep.py", line 965, in...
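"Task exception was never retrieved" is asyncio's way of saying that a task raised an exception which no caller ever awaited or inspected; with aiohttp it usually points at a proxy request that failed after its task was abandoned. The following is a rough illustration of a pattern that avoids the warning, not the project's actual tester code; the test URL is an assumption.

```python
import asyncio

import aiohttp

TEST_URL = 'https://httpbin.org/get'  # hypothetical test target


async def check_proxy(session, proxy):
    """Return True if the proxy can reach the test URL, False otherwise."""
    try:
        async with session.get(TEST_URL,
                               proxy=f'http://{proxy}',
                               timeout=aiohttp.ClientTimeout(total=10)) as resp:
            return resp.status == 200
    except Exception:
        # Handling the exception inside the coroutine means every task's
        # result is retrieved, so asyncio never logs
        # "Task exception was never retrieved".
        return False


async def check_all(proxies):
    async with aiohttp.ClientSession() as session:
        # return_exceptions=True also keeps one failed task from leaving an
        # unretrieved exception behind.
        return await asyncio.gather(*(check_proxy(session, p) for p in proxies),
                                    return_exceptions=True)


if __name__ == '__main__':
    print(asyncio.run(check_all(['127.0.0.1:8888'])))
```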

**Describe the bug** The bug is shown in the picture above, which is described as 'An error has been caught in function 'run_tester', process 'Process-1' (3296), thread 'MainThread' (12048)' I...
**Describe the bug** A clear and concise description of what the bug is. **To Reproduce** Steps to reproduce the behavior: 1. Go to '...' 2. Click on '....' 3. Scroll...
**Describe the bug** For installation I simply cloned the repo and ran docker-compose without changing any configuration, then called the random endpoint directly. The sites I request are the ones from the README, yet none of the proxies seem to work; why is that? How does the pool obtain proxy IPs, and how does it verify that they are valid? Could there be an explanation of how it works?
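Roughly: the Getter crawls public free-proxy listing sites for candidate proxies, the Tester periodically tries each candidate against a test URL and adjusts a score kept in a Redis sorted set (working proxies rise to the top, repeatedly failing ones are dropped), and the Server hands out proxies from the top of that set. Free proxies are unreliable by nature, so many candidates failing is normal. Below is a simplified sketch of the score-keeping idea only; the key name, score values, and test URL are illustrative assumptions, not the project's exact constants.

```python
import redis
import requests

# Illustrative constants, not the project's actual configuration.
REDIS_KEY = 'proxies:demo'
MAX_SCORE, MIN_SCORE, INITIAL_SCORE = 100, 0, 10
TEST_URL = 'https://httpbin.org/get'

db = redis.StrictRedis(host='localhost', port=6379, decode_responses=True)


def add(proxy):
    """Register a freshly crawled proxy with a modest initial score."""
    if db.zscore(REDIS_KEY, proxy) is None:
        db.zadd(REDIS_KEY, {proxy: INITIAL_SCORE})


def test(proxy):
    """Promote proxies that work, demote and eventually drop ones that don't."""
    try:
        ok = requests.get(TEST_URL,
                          proxies={'http': f'http://{proxy}',
                                   'https': f'http://{proxy}'},
                          timeout=10).status_code == 200
    except requests.RequestException:
        ok = False
    if ok:
        db.zadd(REDIS_KEY, {proxy: MAX_SCORE})
    else:
        db.zincrby(REDIS_KEY, -1, proxy)
        if db.zscore(REDIS_KEY, proxy) <= MIN_SCORE:
            db.zrem(REDIS_KEY, proxy)
```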
Hello, I'm a beginner. When I run the tester I get the error shown above. What is the problem, and how can I fix it?
I crawl some sites with many threads, and when the thread count is very high, every request to a target site costs another request to my own proxy API... So could the API return several proxies at once, so that my crawler can tell when they are used up and then fetch another batch?
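One workaround that needs no server change is to fetch a batch on the crawler side and share it between threads, refilling when it runs out. The sketch below assumes the service exposes an `/all` route that returns one proxy per line (check your deployment's README); the cache class itself is hypothetical.

```python
import threading

import requests

PROXYPOOL_ALL_URL = 'http://127.0.0.1:5555/all'  # assumed batch endpoint


class LocalProxyCache:
    """Client-side cache: one request to the pool serves many crawler threads."""

    def __init__(self):
        self._proxies = []
        self._lock = threading.Lock()

    def _refill(self):
        """Fetch the whole pool once and keep it locally."""
        text = requests.get(PROXYPOOL_ALL_URL, timeout=10).text
        self._proxies = [line.strip() for line in text.splitlines() if line.strip()]

    def get(self):
        """Return one proxy, refilling the local cache from the pool when empty."""
        with self._lock:
            if not self._proxies:
                self._refill()
            if not self._proxies:
                raise RuntimeError('proxy pool returned no proxies')
            return self._proxies.pop()


cache = LocalProxyCache()
# Each crawler thread calls cache.get() instead of hitting /random every time.
```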