This should probably not be a fatal error:

```
    return self.request('POST', url, data=data, **kwargs)
  File "/home/users/federico/.local/lib/python2.7/site-packages/requests/sessions.py", line 456, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/users/federico/.local/lib/python2.7/site-packages/requests/sessions.py", line 559, in send
    r...
```
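A minimal sketch of the intended behaviour, assuming a requests session is available (the helper name and retry policy are illustrative, not dumpgenerator's actual code):

```
import time
import requests

def post_with_retries(session, url, data=None, retries=3, delay=10, **kwargs):
    # Hypothetical helper: treat transient network errors as retryable
    # instead of letting the exception kill the whole dump.
    for attempt in range(1, retries + 1):
        try:
            return session.post(url, data=data, **kwargs)
        except requests.exceptions.RequestException as e:
            print 'POST to %s failed (%s), attempt %d/%d' % (url, e, attempt, retries)
            if attempt == retries:
                raise  # give up only after exhausting retries
            time.sleep(delay * attempt)  # simple linear backoff
```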
http://totaldrama.wikia.com/wiki/Special:Export/Main_Page and http://adventuretime.wikia.com/wiki/Special:Export/Main_Page look ok, but I get (with current master):

```
Analysing http://adventuretime.wikia.com/api.php
[...]
Loading config file...
Resuming previous dump process...
Title list was completed in the previous session...
```
When this error is found, we shouldn't proceed. We should also have an option to delete (or, better, rename) the broken archive and retry the download. Perhaps we should even...
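Something along these lines, perhaps; all the names here are illustrative, since the issue only sketches the desired behaviour:

```
import os

def quarantine_and_retry(path, download, is_valid, max_attempts=3):
    # Hypothetical helper: rename the broken archive aside (safer than
    # deleting it outright) and retry the download a few times.
    for attempt in range(1, max_attempts + 1):
        if os.path.exists(path) and not is_valid(path):
            broken = '%s.broken.%d' % (path, attempt)
            os.rename(path, broken)
            print 'Renamed broken archive to %s' % broken
        download(path)
        if is_valid(path):
            return True
    print 'Still broken after %d attempts, giving up' % max_attempts
    return False
```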
While downloading images, for each one that (hopefully/allegedly) gets downloaded, the script prints:

```
HTTP Error 301. Redirect should happen automatically: please report this as a bug.
```

http://bionet-skola.com/w/index.php
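For what it's worth, requests follows 301/302 automatically, so one way out (a sketch, assuming the image downloader could be moved onto requests) would be:

```
import requests

def download_image(session, url, path):
    # allow_redirects=True is already the default for GET; it is spelled
    # out here only to make the redirect handling explicit.
    r = session.get(url, allow_redirects=True, timeout=30)
    r.raise_for_status()
    with open(path, 'wb') as f:
        f.write(r.content)
    if r.history:  # non-empty when one or more redirects occurred
        print 'Redirected: %s -> %s' % (url, r.url)
```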
This wiki requires login; we should recognise that and exit cleanly (maybe also save the main page's HTML or something):

```
#########################################################################
# Downloading http://aliencity.org/w/api.php
#########################################################################
Checking API... http://aliencity.org/w/api.php
Traceback...
```
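A sketch of one possible detection, assuming the wiki rejects anonymous API reads with MediaWiki's 'readapidenied' error (which private wikis normally do); the helper name is illustrative:

```
import requests

def api_requires_login(api_url):
    r = requests.get(api_url, params={
        'action': 'query',
        'meta': 'siteinfo',
        'format': 'json',
    }, timeout=30)
    try:
        data = r.json()
    except ValueError:
        return False  # not JSON at all; some other problem
    # Private wikis typically answer anonymous reads with this error code.
    return data.get('error', {}).get('code') == 'readapidenied'
```

On True, the script could save the main page's HTML for the record and exit with a clear message instead of a traceback.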
```
Analysing http://lackadaisycats.com/wiki/api.php
Trying generating a new dump into a new directory...
Loading page titles from namespaces = all
Excluding titles from namespaces = None
Sleeping... 1 seconds...
Error: could...
```
https://github.com/WikiTeam/wikiteam/blob/ce6fbfee557582126fd4b7b8ff1653b7fc589da5/listsofwikis/mediawiki/wikia.py#L53 is too simplistic. For instance, http://hkbus.wikia.com/wiki/Special:Version points to http://s3.amazonaws.com/wikia_xml_dumps/z/zh/zhhongkongbus_pages_full.xml.gz (probably the wiki was renamed at some point?). The script concludes a dump is missing, while it's just looking in...
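One possible fix (only a sketch; the page and regex are assumptions, not tested against live Wikia) is to scrape the dump URL the wiki itself advertises rather than deriving it from the subdomain, since the advertised link survives renames like hkbus -> zhhongkongbus:

```
import re
import requests

def find_wikia_dump_url(wiki_base):
    # Look for the s3.amazonaws.com/wikia_xml_dumps link on a page the
    # wiki itself serves, instead of guessing the path from the subdomain.
    r = requests.get(wiki_base + '/wiki/Special:Statistics', timeout=30)
    m = re.search(
        r'https?://s3\.amazonaws\.com/wikia_xml_dumps/\S+?\.xml\.gz',
        r.text)
    return m.group(0) if m else None
```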
I'm done downloading 1306 EditThis wikis. If the list gets updated, I can dump any new wiki as well. One never knows how much time is left before it goes...
Before I forget again: download or build a test case for an invalid XML dump as Wikia makes them, that is:

- sha1 tag in the wrong place (not inside...
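Until a real fixture exists, a sketch of a check for the first defect (assuming, per the MediaWiki export schema, that the sha1 tag belongs inside the revision element; the truncated list above presumably names more defects):

```
import xml.etree.ElementTree as ET

def sha1_tags_well_placed(path):
    tree = ET.parse(path)
    root = tree.getroot()
    # ElementTree has no parent pointers, so build a child -> parent map.
    parent_of = {child: parent for parent in root.iter() for child in parent}
    for elem in root.iter():
        if elem.tag.split('}')[-1] == 'sha1':  # strip the XML namespace
            parent = parent_of.get(elem)
            if parent is None or parent.tag.split('}')[-1] != 'revision':
                return False
    return True
```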
```
python not-archived.py
[...]
http://servis-it.ru/api.php 1 0
Traceback (most recent call last):
  File "not-archived.py", line 68, in <module>
    main()
  File "not-archived.py", line 61, in main
    print i[1], i[2], i[3], i[0]
UnicodeEncodeError:...
```
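The usual Python 2 fix is to encode unicode fields explicitly instead of relying on the terminal encoding; a minimal sketch (the helper name is mine):

```
# -*- coding: utf-8 -*-

def safe_print(*fields):
    # Encode unicode fields to UTF-8 by hand so print never has to guess
    # an encoding (which is what raises UnicodeEncodeError in Python 2).
    print ' '.join(
        f.encode('utf-8') if isinstance(f, unicode) else str(f)
        for f in fields
    )

# e.g. line 61 of not-archived.py would become:
# safe_print(i[1], i[2], i[3], i[0])
```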