CalibreLibgenStore
TimeoutError
Just installed your plugin; the search works, but the download often times out.
calibre, version 5.34.0 (linux, embedded-python: False)
Failed to download e-book: Failed: Downloading Il lupo della steppa (Italian, 94Kb).epub
Starting job: Downloading Il lupo della steppa (Italian, 94Kb).epub
Job: "Downloading Il lupo della steppa (Italian, 94Kb).epub" failed with error:
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/mechanize/_urllib2_fork.py", line 1236, in do_open
    h.request(str(req.get_method()), str(req.get_selector()), req.data,
  File "/usr/lib/python3.9/http/client.py", line 1285, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.9/http/client.py", line 1331, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.9/http/client.py", line 1280, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.9/http/client.py", line 1040, in _send_output
    self.send(msg)
  File "/usr/lib/python3.9/http/client.py", line 980, in send
    self.connect()
  File "/usr/lib/python3.9/http/client.py", line 946, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.9/socket.py", line 844, in create_connection
    raise err
  File "/usr/lib/python3.9/socket.py", line 832, in create_connection
    sock.connect(sa)
TimeoutError: [Errno 110] Connessione scaduta (Connection timed out)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/lib/calibre/calibre/gui2/threaded_jobs.py", line 83, in start_work
    self.result = self.func(*self.args, **self.kwargs)
  File "/usr/lib/calibre/calibre/gui2/ebook_download.py", line 85, in __call__
    dfilename = self._download(cookie_file, url, filename, save_loc, add_to_lib, create_browser)
  File "/usr/lib/calibre/calibre/gui2/ebook_download.py", line 101, in _download
    return download_file(url, cookie_file, filename, create_browser=create_browser)
  File "/usr/lib/calibre/calibre/gui2/ebook_download.py", line 68, in download_file
    with closing(br.open(url)) as r:
  File "/usr/lib/python3/dist-packages/mechanize/_mechanize.py", line 257, in open
    return self._mech_open(url_or_request, data, timeout=timeout)
  File "/usr/lib/python3/dist-packages/mechanize/_mechanize.py", line 287, in _mech_open
    response = UserAgentBase.open(self, request, data)
  File "/usr/lib/python3/dist-packages/mechanize/_opener.py", line 193, in open
    response = urlopen(self, req, data)
  File "/usr/lib/python3/dist-packages/mechanize/_urllib2_fork.py", line 425, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3/dist-packages/mechanize/_urllib2_fork.py", line 414, in _call_chain
    result = func(*args)
  File "/usr/lib/python3/dist-packages/mechanize/_urllib2_fork.py", line 1258, in http_open
    return self.do_open(HTTPConnection, req)
  File "/usr/lib/python3/dist-packages/mechanize/_urllib2_fork.py", line 1240, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 110] Connessione scaduta> (Connection timed out)
Called with args: (<calibre.gui2.ui.Main object at 0x7f6b2e7ec5e0>, None, 'http://31.42.184.140/fiction/1605000/eb5e708377e7bc317ca62aa1554b1e19.epub/Hesse%2C%20Hermann%20-%20Il%20lupo%20della%20steppa.epub', 'Il lupo della steppa (Italian, 94Kb).epub', '', True, [], <bound method StorePlugin.create_browser of <calibre_plugins.libgen_fiction.LibgenStore object at 0x7f6b2b66feb0>>) {'notifications': <queue.Queue object at 0x7f6b28321a30>, 'abort': <threading.Event object at 0x7f6b22574520>, 'log': <calibre.utils.logging.GUILog object at 0x7f6b22574700>}
I noticed that even in the browser the download sometimes takes an incredibly long time; it's not that the transfer is slow. The download doesn't even start for a few minutes, then it starts and finishes in under a second.
Could you set a higher timeout, or make it a configurable parameter?
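For reference, one possible approach, sketched here and untested against the plugin: calibre's download_file() calls br.open(url) with no explicit timeout, so the download falls back to the process-wide socket default. A helper like the one below (apply_download_timeout and DOWNLOAD_TIMEOUT are hypothetical names, not the plugin's actual code) could raise that default before the download and restore it afterwards:

```python
import socket

# Hypothetical knob: in the real plugin this value could come from a
# configuration dialog instead of a hard-coded constant.
DOWNLOAD_TIMEOUT = 300  # seconds

def apply_download_timeout(timeout=DOWNLOAD_TIMEOUT):
    """Raise the process-wide socket timeout.

    calibre's download_file() opens the URL without an explicit
    timeout, so mechanize falls back to the global socket default;
    raising it makes the download wait longer before giving up.
    Returns the previous value so the caller can restore it.
    """
    previous = socket.getdefaulttimeout()
    socket.setdefaulttimeout(timeout)
    return previous
```

The plugin's create_browser() (or the code around the download call) could call apply_download_timeout() before the request and restore the returned previous value afterwards, so the longer timeout doesn't leak into the rest of calibre's network code.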
That sounds pretty plausible to me - however, I don't spend much time on this anymore, so it might be a while till I get around to it. If you submit a PR, I'd be happy to land it.