infinity
Download all files of a thread
It would be great if we had a simple button to download all files within a thread page. A request would be sent to the server, which would respond with a zip (or tar or gz, whatever) of the thread files. That zip would be created empty when the thread starts and stored on the server. Then, when new files are uploaded, simply update the zip.
Is it a desirable feature? If so, are there any better solutions?
After a tiny bit of thought: we could do it using JSZip to create the zip client-side. So no doubled server space; the zip would be made locally.
Any input is very welcome, please.
Seems like a horrible waste of server resources. I think you are underestimating how resource-intensive good compression is, unless the purpose of using a ZIP archive is not compression but simply to provide a way to download all the files at once in one file (which is possible, as most archive formats can be set to "store" instead of "compress"). Nevertheless, servers would have to store and handle twice the amount of content data they do now, and every time an image is added to a thread the time to create a new archive is O(n), so I am not sure that is such a good idea.
If users really want this, though, we could always provide a client-side script that downloads all the images to a folder, or zips them (you can do that with JavaScript, right?), etc.
Back from research: this library seems better than perfect: http://stuk.github.io/jszip/
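For the record, a minimal sketch of what the client-side approach could look like with JSZip 3.x (assumed to be loaded globally as `JSZip`; the function names `filenameFromUrl` and `downloadThreadAsZip` are made up here, and the files are assumed to be same-origin so `fetch` works):

```javascript
// Pure helper: derive a filename from a file URL (strip query string,
// keep the last path segment).
function filenameFromUrl(url) {
  return url.split("?")[0].split("/").pop();
}

// Fetch every file in the thread, pack them into a zip in the browser,
// and trigger the browser's normal save dialog. Note JSZip stores files
// uncompressed ("STORE") by default, so there is no compression cost.
async function downloadThreadAsZip(fileUrls, zipName) {
  const zip = new JSZip();
  for (const url of fileUrls) {
    const resp = await fetch(url);
    zip.file(filenameFromUrl(url), await resp.blob());
  }
  const blob = await zip.generateAsync({ type: "blob" });
  const a = document.createElement("a"); // anchor with `download` attribute
  a.href = URL.createObjectURL(blob);
  a.download = zipName;
  a.click();
}
```

Note the browser still decides the download directory; JS cannot pick a specific folder, only suggest a filename.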
If the JS is downloading every file individually, I don't see the point of zipping. However, that means everything will be downloaded into the user's default download folder, since JS doesn't have the ability to save to a specific directory.
This library looks like it could do it, though it unfortunately requires Flash.
There is also the possibility of generating the ZIP file remotely, on demand, and not necessarily on the same server as 8chan.
Vlad, everything you said is the same as my second comment. Even the JS library. Glad we agree. bui, the point of using zip is to download everything at once to a specific directory. czaks, yeah, but the resource problem is the same, it's just on another server: you'd still have twice as many files and a lot more processing going on.
Not really. The archives can be generated on demand, and compression is not that expensive; it can also be disabled or weakened for .zip files, or we can generate .tars. Those can be cached as well, for e.g. 2 hours.
czaks, the end result would be the same for the user if we used JSZip for the job (assuming JS is enabled), and it would be more cost-effective for us.
@VladVP
"as well as the fact that every time an image is added to a thread the time to create a new archive is O(n)"
This is not necessarily true. The zip format, for one, supports appending to an archive, so you wouldn't have to recreate the archive every time a new image is added to a thread; you'd only add the new image to the existing archive.
But nevertheless the size constraint is still there: you'd have to store almost double the data if you didn't want to recreate the archives over and over again "on demand", as @czaks suggests.
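A back-of-envelope sketch of why appending beats recreating over a thread's lifetime. Assumed cost model (an assumption for illustration, not a measurement): one unit of work per file written to the archive, so recreating after every upload touches all files so far, while appending touches only the new one:

```javascript
// Total writes if the server recreates the whole archive after each of
// n uploads: 1 + 2 + ... + n = n(n+1)/2.
function recreateCost(n) {
  return (n * (n + 1)) / 2;
}

// Total writes if the server appends each new file instead: one per upload.
function appendCost(n) {
  return n;
}
```

For the 300-image thread mentioned later in this discussion, recreating means 45,150 file-writes total versus 300 for appending, which is the O(n)-per-upload versus O(1)-per-upload difference in concrete numbers.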
@hugofragata not everyone who'd like to download all images has expanded every image, so you could actually reduce the amount of transferred data for people who have only seen the thumbnails and want "the rest" downloaded.
Consider the Firefox extension DownThemAll.
Dollchan Extension Tools can save all images from a thread as an archive (tar, not zip). It should also work on 8chan.
You can use this userscript or copy that functionality as a JS addon for 8chan. (8chan already has an "expand all images" option; this could be another link: "save all images".)
By the way, images can't be compressed well with zip, so there would be no benefit from making the zip server-side; it won't save bandwidth, since the same amount of data will be transferred.
But if we create the archive client-side, some images can be taken from the browser cache. For example, if you first clicked "expand all images" and then decided to save all, everything would be taken from the browser cache (and in that case the archive would be created in seconds).
Problems could arise in threads with heavy content. Imagine a thread with 300 posts where each has 5 images of HD content: that could eat all available memory and crash the browser.
Just a quick idea: each post already has a checkbox (for deletion), and this could be used to select the images you want to download.
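A rough sketch of the checkbox idea. The selectors (`.post`, `a.file-link`) are hypothetical placeholders, not 8chan's actual markup; the selection logic itself is kept as a pure function:

```javascript
// Pure core: given [{checked, url}, ...], keep the URLs of checked posts.
function selectedUrls(posts) {
  return posts.filter(p => p.checked).map(p => p.url);
}

// Browser glue (guarded so the pure part is usable elsewhere): scrape the
// existing per-post deletion checkboxes and collect the chosen file links.
if (typeof document !== "undefined") {
  const posts = Array.from(document.querySelectorAll(".post")).map(post => ({
    checked: post.querySelector('input[type="checkbox"]')?.checked ?? false,
    url: post.querySelector("a.file-link")?.href, // hypothetical selector
  }));
  console.log(selectedUrls(posts)); // feed this list to the zip/download step
}
```

Downloading only the selected files would also sidestep the memory worry above, since the user naturally limits how much the browser has to hold at once.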