details about webp
it would be nice to know some facts about the webp encoding you use,
since no variable parameters are possible:
- do you use the -mt parameter (multithreading)?
- do you use the -m6 parameter (maximum compression effort)?
- how can i use the near-lossless modes?
- which version of the webp codec do you use?
You can check the defaults here: https://github.com/gen2brain/webp and https://github.com/gen2brain/cbconvert/blob/master/cbconvert_convert.go#L392; I am not aware of any other webp encoder. Just so you know, I do not plan to add options that only work for one specific format; multithreading options are not usable here and would be pointless anyway, and the compression method is at its default (4).
By the way, you may want to try JPEG. Unlike similar software, cbconvert uses jpegli, which can often produce files even smaller than WebP.
it's strange that you don't use the optimum for one format while intending to optimise another. what's stopping you from using method 6 for webp? in that case, even the suggested zero compression could deliver worse results. actually, i'm still waiting for the release of avif/av2 or an AI-based image codec before starting the big conversion. greetings from the biggest shadow library in the world ;-)
WebP has a Method option, which is a quality/speed trade-off (https://pkg.go.dev/github.com/gen2brain/webp#Options); I don't think the optimum is necessarily the slowest method (the range is 0-6). The problem is that avif has Speed in the range [0-10], jpegxl uses Effort in the range [1-10], and then there are formats with Lossless support that ignore Quality entirely. I don't see a way to expose a single option for all of that, and adding options that only work for a single format is very ugly.
once you experience what it's like to move through comic books almost seamlessly, at 10 to 100 times the speed, you'll never go without zero compression again. and it only makes sense if you run the encoding efficiency at maximum: once encoded, even if it takes twice as long, decoding a 20% smaller file is a gain every time it's read, not only in time but also in power consumption. actually, it doesn't need a new option: in the case of webp, method 6 plus a zero-compression archive should simply be the standard.

i was involved in the development of the format back then (at google), and can tell you there is actually an even better option: webp can store image sequences. intelligent logic could sort the pages into similarity groups and store each group as a sequence. webp would then also be the container, and you wouldn't even need an archive format, since the entire comic book would already be contained in a single webp file. unfortunately this trick isn't used for comic books, even though it would save a good 50% on the statistical average.
but I didn't want to go that far :D
why doesn't compression really make sense on top of webp? webp (in its lossless mode) already uses Lempel-Ziv 77 (LZ77) internally, which is optimised for repeating blocks, so webp files are not really suited to further compression. by the way, that's another reason why i found the .cb7 extension quite appropriate here :D
I can then change that to 6 for WebP for now. I would prefer a single option with a slider in the GUI, but Method, Speed, and Effort are similar options, not quite the same thing.
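One way such a single slider could be mapped onto the different native ranges, as a rough sketch; the linear mapping, the function name, and the inverted handling of avif's Speed (where higher means faster, not more effort) are my assumptions, not cbconvert's actual code:

```go
package main

import "fmt"

// scaleEffort maps a unified 0-100 "effort" slider onto a codec's
// native [min, max] range, rounding to the nearest step.
func scaleEffort(slider, min, max int) int {
	return min + (slider*(max-min)+50)/100
}

func main() {
	for _, slider := range []int{0, 50, 100} {
		fmt.Printf("slider=%3d  webp Method=%d  avif Speed=%d  jxl Effort=%d\n",
			slider,
			scaleEffort(slider, 0, 6),     // webp: 0 fast .. 6 slow/small
			10-scaleEffort(slider, 0, 10), // avif: Speed is the inverse of effort
			scaleEffort(slider, 1, 10),    // jpegxl: Effort 1 fast .. 10 slow
		)
	}
}
```

Formats whose Lossless mode ignores Quality would still need separate handling, which is the part this sketch does not solve.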
And thank you for WebP, if you were involved. I recently did some work with libavif, libjxl, libheif, jpegli, and libwebp, compiling them to WASM and wrapping them for use with Go's wazero, and libwebp was the nicest of them all: it compiles cleanly, threads are easy to disable, and the API is nice and simple. It was an absolute joy compared to all the others. But since I discovered jpegli, I have preferred to use that instead; it is universally supported, and I save about 100 MB on a dozen comics.
Btw, I am curious how you will find .cbt; tar doesn't have an index and can store files in any order, so a reader may have to scan the whole file, though I think it would be similar with .cbz. I know that unarr, which I use, will read the entire 7z if it is a solid archive (and many are, I am not sure why). I am also curious whether your comic book reader can read tar; several apps I tried didn't have any problems.
actually, "solid" and "zero compression" are mutually exclusive, but with 7-Zip there is still the possibility that only the header/index is compressed (it is text). whether unarr is "stupid" with zero compression and reads the whole archive even though that's unnecessary would have to be tested; at least with windows explorer and the comic viewer i use (sumatra), this doesn't happen, and both are therefore extremely fast.