Failing to download covers...again
Hi,
I'm using version 0.10.7 of the plugin in Calibre 2.51.0, and it just quit downloading covers on me a couple of days ago.
Log is as follows:
Running identify query with parameters:
{u'identifiers': {u'comicvine': u'356772', u'comicvine-volume': u'35835', u'isbn': u'Issue ID 356772'},
 u'timeout': 30,
 u'title': u'Uncanny X-Force #31: Final Execution',
 u'authors': [u'Alan Fine', u'Axel Alonso', u'Cory Petit', u'Dan Buckley', u'Dean White', u'Frank Martin Jr.', u'Jared K. Fletcher', u'Jerome Ope\xf1a', u'Joe Quesada', u'Jordan D. White', u'Nick Lowe', u'Phil Noto', u'Rick Remender']}
Using plugins: Comicvine
The log from individual plugins is below
****************************** Comicvine ******************************
Request extra headers: [('User-agent', 'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)')]
Found 1 results
Downloading from Comicvine took 7.6779999733
Title       : Uncanny X-Force #31: Final Execution
Author(s)   : Alan Fine & Axel Alonso & Cory Petit & Dan Buckley & Dean White & Frank Martin Jr. & Jared K. Fletcher & Jerome Opeña & Joe Quesada & Jordan D. White & Nick Lowe & Phil Noto & Rick Remender
Publisher   : Marvel
Series      : Uncanny X-Force #31
Published   : 2012-09-12T04:00:00+00:00
Identifiers : comicvine-volume:35835, comicvine:356772
Comments    :
The FINAL EXECUTION kicks into high gear. What is left of X-Force go up against the new Brotherhood of Evil Mutants for the last time.
Looking up Issue(356772)
Adding Issue(356772) to queue
Added Issue(Uncanny X-Force #31: Final Execution) to queue
The identify phase took 7.84 seconds
The longest time (7.678000) was taken by: Comicvine
Merging results from different sources and finding earliest publication dates from the worldcat.org service
We have 1 merged results, merging took: 0.00 seconds
Starting cover download for: Uncanny X-Force #31: Final Execution
Query: Uncanny X-Force #31: Final Execution [u'Alan Fine', u'Axel Alonso', u'Cory Petit', u'Dan Buckley', u'Dean White', u'Frank Martin Jr.', u'Jared K. Fletcher', u'Jerome Ope\xf1a', u'Joe Quesada', u'Jordan D. White', u'Nick Lowe', u'Phil Noto', u'Rick Remender'] {u'comicvine': u'356772', u'comicvine-volume': u'35835'}
****************************** Comicvine Covers ******************************
Request extra headers: [('User-agent', 'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)')]
Failed to download valid cover
Took 3.53800010681 seconds
Downloading cover from: http://static.comicvine.comhttp://static.comicvine.com/uploads/scale_large/7/71975/2586319-prev_img.jpeg
Failed to download cover from: http://static.comicvine.comhttp://static.comicvine.com/uploads/scale_large/7/71975/2586319-prev_img.jpeg
Traceback (most recent call last):
  File "calibre_plugins.comicvine.source", line 173, in download_cover
  File "site-packages\mechanize-0.2.5-py2.7.egg\mechanize\_mechanize.py", line 199, in open_novisit
  File "site-packages\mechanize-0.2.5-py2.7.egg\mechanize\_mechanize.py", line 230, in _mech_open
  File "site-packages\mechanize-0.2.5-py2.7.egg\mechanize\_opener.py", line 193, in open
  File "site-packages\mechanize-0.2.5-py2.7.egg\mechanize\_urllib2_fork.py", line 344, in _open
  File "site-packages\mechanize-0.2.5-py2.7.egg\mechanize\_urllib2_fork.py", line 332, in _call_chain
  File "site-packages\mechanize-0.2.5-py2.7.egg\mechanize\_urllib2_fork.py", line 1142, in http_open
  File "site-packages\mechanize-0.2.5-py2.7.egg\mechanize\_urllib2_fork.py", line 1118, in do_open
URLError: <urlopen error [Errno 11001] getaddrinfo failed>

Downloading cover from: http://static.comicvine.comhttp://static.comicvine.com/uploads/scale_medium/7/71975/2586319-prev_img.jpeg
Failed to download cover from: http://static.comicvine.comhttp://static.comicvine.com/uploads/scale_medium/7/71975/2586319-prev_img.jpeg
Traceback (most recent call last):
  (same traceback as above, ending in)
URLError: <urlopen error [Errno 11001] getaddrinfo failed>

Downloading cover from: http://static.comicvine.comhttp://static.comicvine.com/uploads/scale_small/7/71975/2586319-prev_img.jpeg
Failed to download cover from: http://static.comicvine.comhttp://static.comicvine.com/uploads/scale_small/7/71975/2586319-prev_img.jpeg
Traceback (most recent call last):
  (same traceback as above, ending in)
URLError: <urlopen error [Errno 11001] getaddrinfo failed>
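For what it's worth, `[Errno 11001] getaddrinfo failed` is Windows' DNS-resolution failure, and that's exactly what you'd expect from the doubled URL: everything between the first `http://` and the next `/` gets treated as the host, which isn't a real domain. A quick illustration (Python 3's `urllib.parse` here; the plugin itself runs on Python 2's `urlparse`, which behaves the same way):

```python
from urllib.parse import urlparse

# The malformed URL straight from the log above
bad = ('http://static.comicvine.com'
       'http://static.comicvine.com/uploads/scale_large/7/71975/2586319-prev_img.jpeg')

# Everything up to the first '/' after the scheme is the network location,
# so the "hostname" mechanize tries to resolve is garbage:
print(urlparse(bad).hostname)  # static.comicvine.comhttp
```

No DNS server will resolve `static.comicvine.comhttp`, hence the URLError on every size variant.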
Please investigate.
It looks like the request is doubling the URL again:
Failed to download cover from: http://static.comicvine.comhttp://static.comicvine.com/uploads/scale_small/7/71975/2586319-prev_img.jpeg
I don't remember what the resolution was; I think ComicVine eventually reverted whatever change they'd made and it just started working again.
Something in this function, maybe?
def download_cover(self, log, result_queue, abort, title=None, authors=None,
                   identifiers=None, timeout=30, get_best_cover=False):
  if identifiers and 'comicvine' in identifiers:
    for url in utils.cover_urls(identifiers['comicvine'], get_best_cover):
      url = 'http://static.comicvine.com' + url
      browser = self.browser
      log('Downloading cover from:', url)
      try:
        cdata = browser.open_novisit(url, timeout=timeout).read()
        result_queue.put((self, cdata))
      except:
        log.exception('Failed to download cover from:', url)
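The suspicious part is that unconditional prepend. If `utils.cover_urls()` used to yield site-relative paths but the ComicVine API has started returning absolute URLs, the concatenation reproduces the doubled host from the log exactly (a sketch; the path is taken from the log above):

```python
BASE = 'http://static.comicvine.com'

# What the plugin expects cover_urls() to yield: a site-relative path
relative = '/uploads/scale_large/7/71975/2586319-prev_img.jpeg'
print(BASE + relative)   # a valid URL

# What it apparently yields now: an already-absolute URL
absolute = BASE + relative
print(BASE + absolute)
# http://static.comicvine.comhttp://static.comicvine.com/uploads/scale_large/7/71975/2586319-prev_img.jpeg
```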
Yeah, here's the commit from the last time:
      timeout=30, get_best_cover=False):
    if identifiers and 'comicvine' in identifiers:
      for url in utils.cover_urls(identifiers['comicvine'], get_best_cover):
-       url = 'http://static.comicvine.com' + url
        browser = self.browser
        log('Downloading cover from:', url)
        try:
So in theory, if we were to comment out that "+" line, it might start working again.
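Rather than deleting the line outright (which would break again if ComicVine reverts to relative paths), a more defensive sketch would prepend the host only when the URL has no scheme. To be clear, this guard is my suggestion, not the plugin's actual code:

```python
try:
    from urllib.parse import urlparse  # Python 3
except ImportError:
    from urlparse import urlparse      # Python 2, which Calibre 2.x runs on

BASE = 'http://static.comicvine.com'

def normalize_cover_url(url):
    """Prepend the static host only for site-relative paths, so the
    loop works whether cover_urls() yields relative or absolute URLs."""
    if urlparse(url).scheme:  # already absolute ('http://...'), leave it alone
        return url
    return BASE + url
```

Inside `download_cover`, the `url = 'http://static.comicvine.com' + url` line would then become `url = normalize_cover_url(url)`.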
Just,... can't,.... stop,... commenting,.... :)
So, yes. Go to C:\Users
Until they revert it again. Lather, rinse, repeat, wipe hands on pants.
That worked, Mikey!!!
Thank you so much! I totally owe you a beer...