
breaks after some queries

avacaondata opened this issue 3 years ago · 2 comments

When using this together with `parlai interactive`, the following error appears after 2-4 conversation turns:

requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=8080): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000027318DBB388>: Failed to establish a new connection: [WinError 10049] The requested address is not valid in its context'))

The full trace:

Traceback (most recent call last):
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\urllib3\connection.py", line 175, in _new_conn
    (self._dns_host, self.port), self.timeout, **extra_kw
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\urllib3\util\connection.py", line 95, in create_connection
    raise err
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
    sock.connect(sa)
OSError: [WinError 10049] The requested address is not valid in its context

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\urllib3\connectionpool.py", line 710, in urlopen
    chunked=chunked,
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\urllib3\connectionpool.py", line 398, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\urllib3\connection.py", line 239, in request
    super(HTTPConnection, self).request(method, url, body=body, headers=headers)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\http\client.py", line 1281, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\http\client.py", line 1327, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\http\client.py", line 1276, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\http\client.py", line 1036, in _send_output
    self.send(msg)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\http\client.py", line 976, in send
    self.connect()
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\urllib3\connection.py", line 205, in connect
    conn = self._new_conn()
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\urllib3\connection.py", line 187, in _new_conn
    self, "Failed to establish a new connection: %s" % e
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x0000027318DBB388>: Failed to establish a new connection: [WinError 10049] The requested address is not valid in its context

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\requests\adapters.py", line 450, in send
    timeout=timeout
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\urllib3\connectionpool.py", line 786, in urlopen
    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\urllib3\util\retry.py", line 592, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='0.0.0.0', port=8080): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000027318DBB388>: Failed to establish a new connection: [WinError 10049] The requested address is not valid in its context'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\Scripts\parlai.exe\__main__.py", line 7, in <module>
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\__main__.py", line 14, in main
    superscript_main()
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\core\script.py", line 325, in superscript_main
    return SCRIPT_REGISTRY[cmd].klass._run_from_parser_and_opt(opt, parser)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\core\script.py", line 108, in _run_from_parser_and_opt
    return script.run()
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\scripts\interactive.py", line 118, in run
    return interactive(self.opt)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\scripts\interactive.py", line 93, in interactive
    world.parley()
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\tasks\interactive\worlds.py", line 89, in parley
    acts[1] = agents[1].act()
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\core\torch_agent.py", line 2143, in act
    response = self.batch_act([self.observation])[0]
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\core\torch_agent.py", line 2239, in batch_act
    output = self.eval_step(batch)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\projects\blenderbot2\agents\blenderbot2.py", line 790, in eval_step
    output = super().eval_step(batch)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\agents\rag\rag.py", line 290, in eval_step
    output = super().eval_step(batch)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\core\torch_generator_agent.py", line 876, in eval_step
    batch, self.beam_size, maxlen, prefix_tokens=prefix_tokens
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\agents\rag\rag.py", line 673, in _generate
    gen_outs = self._rag_generate(batch, beam_size, max_ts, prefix_tokens)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\agents\rag\rag.py", line 713, in _rag_generate
    self, batch, beam_size, max_ts, prefix_tokens
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\core\torch_generator_agent.py", line 1094, in _generate
    encoder_states = model.encoder(*self._encoder_input(batch))
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\projects\blenderbot2\agents\modules.py", line 821, in encoder
    segments,
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\projects\blenderbot2\agents\modules.py", line 226, in encoder
    num_memory_decoder_vecs,
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\projects\blenderbot2\agents\modules.py", line 357, in retrieve_and_concat
    search_queries, query_vec, search_indices
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\projects\blenderbot2\agents\modules.py", line 519, in perform_search
    query_vec[search_indices]  # type: ignore
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\agents\rag\retrievers.py", line 411, in retrieve
    docs, scores = self.retrieve_and_score(query)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\agents\rag\retrievers.py", line 1192, in retrieve_and_score
    search_results_batach = self.search_client.retrieve(search_queries, self.n_docs)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\agents\rag\retrieve_api.py", line 132, in retrieve
    return [self._retrieve_single(q, num_ret) for q in queries]
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\agents\rag\retrieve_api.py", line 132, in <listcomp>
    return [self._retrieve_single(q, num_ret) for q in queries]
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\agents\rag\retrieve_api.py", line 111, in _retrieve_single
    search_server_resp = self._query_search_server(search_query, num_ret)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\parlai\agents\rag\retrieve_api.py", line 89, in _query_search_server
    server_response = requests.post(server, data=req)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\requests\api.py", line 117, in post
    return request('post', url, data=data, json=json, **kwargs)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\requests\api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\requests\sessions.py", line 529, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\requests\sessions.py", line 645, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\Usuario\anaconda3\envs\parlaisearch\lib\site-packages\requests\adapters.py", line 519, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=8080): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000027318DBB388>: Failed to establish a new connection: [WinError 10049] The requested address is not valid in its context'))
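For what it's worth, WinError 10049 ("the requested address is not valid in its context") usually means the client tried to *connect* to 0.0.0.0, which on Windows is a bind-only wildcard address, not a valid destination. A minimal, hypothetical workaround is to rewrite the server address to the loopback address before the client queries it (the helper name below is an assumption for illustration, not part of ParlAI):

```python
def normalize_search_server(server: str) -> str:
    """Rewrite a bind-style address into a connectable one.

    Windows refuses outgoing connections to 0.0.0.0 (WinError 10049),
    so replace it with the loopback address before making the request.
    """
    return server.replace("0.0.0.0", "127.0.0.1")


# The address the trace above fails to reach, remapped to loopback:
print(normalize_search_server("http://0.0.0.0:8080"))  # http://127.0.0.1:8080
```

Equivalently, it may be enough to start `parlai interactive` with the search server configured as `127.0.0.1:8080` instead of `0.0.0.0:8080`, since the intermittent failures suggest the address only sometimes resolves to a usable interface.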

@JulesGM @klshuster

avacaondata — Jan 08 '22

I'm observing similar behavior. Perhaps Google is throwing a CAPTCHA?

Darth-Carrotpie — Jul 19 '22

Did you find a solution?

kiminomiku — Sep 08 '22