
Download all files of a thread

hugofragata opened this issue 10 years ago · 14 comments

It would be great if we had a simple button to download all files within a thread page. A request would be sent to the server, which would respond with a zip (or tar or gz, whatever) of the thread files. That zip would be created empty when the thread starts and stored on the server. Then, when new files are uploaded, the zip would simply be updated.

Is it a desirable feature? If so, any better solutions?


hugofragata avatar Oct 01 '14 22:10 hugofragata

After a bit of thought: we could use JSZip to create the zip client-side, so no doubled server space. The zip would be made locally.

Any input is very welcome.

hugofragata avatar Oct 01 '14 22:10 hugofragata
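The client-side idea above can be sketched with the real JSZip API (`zip.file()` and `generateAsync()`). This is only a sketch: the `a.file` selector and the globally loaded `JSZip` are assumptions, since the actual thread markup isn't specified here.

```javascript
// Sketch of the client-side approach: collect full-size image links from
// the thread, fetch each one, and pack them with JSZip. The selector
// 'a.file' is hypothetical -- the real board markup may differ.
function imageName(url) {
  // Derive an archive entry name from the image URL (drop path and query).
  return url.split('/').pop().split('?')[0];
}

async function downloadThreadZip(doc) {
  const zip = new JSZip();                        // JSZip assumed loaded globally
  const links = doc.querySelectorAll('a.file');   // hypothetical selector
  for (const a of links) {
    const resp = await fetch(a.href);             // already-viewed images hit the cache
    zip.file(imageName(a.href), await resp.blob());
  }
  const blob = await zip.generateAsync({ type: 'blob' });
  const save = doc.createElement('a');            // trigger a single download
  save.href = URL.createObjectURL(blob);
  save.download = 'thread.zip';
  save.click();
}
```

Note the files still land wherever the browser puts downloads; this only turns many downloads into one.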

Seems like a horrible waste of server resources. I think you are underestimating how resource-intensive good compression is, unless the purpose of using a ZIP archive is not compression but simply a way to download all the files at once in one file. (Which is possible, as most archive formats can be set to "store" instead of "compress".) Nevertheless, servers would have to store and handle twice the amount of content data they do now, and every time an image is added to a thread, creating a new archive takes O(n) time, so I am not sure that is such a good idea.

aponigricon avatar Oct 02 '14 07:10 aponigricon

If users really want this, though, we could always provide a client-side script that downloads all the images to a folder, or zips them (you can do that with JavaScript, right?), etc.

aponigricon avatar Oct 02 '14 07:10 aponigricon

Back from research: this library seems close to perfect: http://stuk.github.io/jszip/

aponigricon avatar Oct 02 '14 07:10 aponigricon

If the JS is downloading every file individually, I don't see the point of zipping. However, that means everything will be downloaded into the user's default download folder, since JS doesn't have the ability to save to a specific directory.

This library looks like it could do it, though unfortunately it requires Flash.

bui avatar Oct 02 '14 08:10 bui

There is also the possibility of generating the ZIP file remotely, on demand, and not necessarily on the same server as 8chan.

czaks avatar Oct 02 '14 10:10 czaks

Vlad, everything you said matches my second comment, even the JS library. Glad we agree. bui, the point of using a zip is to download everything at once to a specific directory. czaks, yeah, but the resource problem is the same, just on another server: you'd still have twice as many files and a lot more processing going on.

hugofragata avatar Oct 02 '14 12:10 hugofragata

Not really. The archives can be generated on demand, and compression is not that expensive; it can also be disabled or weakened for .zip files, or we can generate .tars instead. Those can also be cached, e.g. for 2 hours.

czaks avatar Oct 02 '14 13:10 czaks
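The on-demand `.tar` idea is cheap because tar involves no compression at all: an archive is just each file's bytes prefixed with a 512-byte ustar header, padded to 512-byte blocks, plus a 1024-byte zero trailer. A minimal Node.js builder as a sketch (not 8chan code; the mode/uid/mtime values are placeholder defaults):

```javascript
// Minimal store-only TAR builder: 512-byte ustar headers interleaved
// with file data, no compression anywhere.
function tarHeader(name, size) {
  const h = Buffer.alloc(512);
  h.write(name, 0, 100, 'utf8');                                // file name
  h.write('0000644\0', 100, 8);                                 // mode (octal)
  h.write('0000000\0', 108, 8);                                 // uid
  h.write('0000000\0', 116, 8);                                 // gid
  h.write(size.toString(8).padStart(11, '0') + '\0', 124, 12);  // size (octal)
  h.write('00000000000\0', 136, 12);                            // mtime
  h.write('        ', 148, 8);                                  // checksum = spaces for now
  h.write('0', 156, 1);                                         // typeflag: regular file
  h.write('ustar\u000000', 257, 8);                             // ustar magic + version
  let sum = 0;
  for (const b of h) sum += b;                                  // checksum over header bytes
  h.write(sum.toString(8).padStart(6, '0') + '\0 ', 148, 8);    // write real checksum
  return h;
}

function buildTar(files) { // files: [{ name, data: Buffer }]
  const parts = [];
  for (const { name, data } of files) {
    parts.push(tarHeader(name, data.length));
    parts.push(data);
    const pad = (512 - (data.length % 512)) % 512;               // pad data to 512-byte blocks
    if (pad) parts.push(Buffer.alloc(pad));
  }
  parts.push(Buffer.alloc(1024));                                // two zero blocks end the archive
  return Buffer.concat(parts);
}
```

Since building one is just concatenation, a front server could stream it per request and cache the result for a couple of hours, as suggested above.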

czaks, the end result would be the same for the user if we used JSZip for the job (assuming JS is enabled), and it would be more cost-effective for us.

hugofragata avatar Oct 02 '14 18:10 hugofragata

@VladVP

> as well as the fact that every time an image is added to a thread the time to create a new archive is O(n)

This is not necessarily true. The zip format, for one, can append to an existing archive, so you wouldn't have to recreate the archive every time a new image is added to a thread; you would only add the new image to the existing archive.

But the size constraint is still there nevertheless: you'd have to store almost double the data if you didn't want to recreate the archives over and over again "on demand", as @czaks suggests.

@hugofragata not everyone who'd like to download all images has expanded every image, so this could actually reduce the amount of transferred data for people who have not looked at all the images (only have the thumbs) and want "the rest" downloaded.

marcules avatar Oct 18 '14 21:10 marcules

Consider the Firefox extension DownThemAll.

dudeFortune avatar Jan 26 '15 09:01 dudeFortune

Dollchan Extension Tools can save all images from a thread as an archive (tar, not zip). It should also work on 8chan.

You can use this userscript or copy that functionality into a JS addon for 8chan. (8chan already has an "expand all images" option; this could be another link, "save all images".)

desudesutalk avatar Jan 26 '15 10:01 desudesutalk

By the way, images can't be compressed well with zip, so there would be no benefit to making the zip server-side; it won't save bandwidth, since the same amount of data will be transferred either way.

But if we create the archive client-side, some images can be taken from the browser cache. For example, if you first clicked "expand all images" and then decided to save all, everything would come from the cache (and the archive would be created in seconds).

Problems can arise in threads with heavy content. Imagine a thread with 300 posts where each has 5 HD images: this could eat all the memory and crash the browser.

desudesutalk avatar Jan 26 '15 10:01 desudesutalk
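One way to keep a heavy thread from exhausting memory is to cap how many images are in flight at once instead of fetching all of them eagerly. A small generic helper as a sketch (`worker` would be the real fetch-and-add-to-archive step, which is an assumption here):

```javascript
// Run an async worker over `items` with at most `limit` in flight at once,
// so a 1500-image thread never holds more than `limit` downloads in memory
// before each one is handed off to the archive.
async function mapLimit(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;
  async function lane() {
    while (next < items.length) {
      const i = next++;                 // claim the next index
      results[i] = await worker(items[i], i);
    }
  }
  // Start `limit` lanes that drain the shared queue, then wait for all.
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, lane));
  return results;
}
```

Used with the client-side archive idea, each worker would fetch one image, add it to the archive, and release the blob before the next fetch starts in that lane.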

Just a quick idea: each post already has a checkbox (for deletion); this could be used to select which images you want to download.

desudesutalk avatar Jan 26 '15 10:01 desudesutalk