"Qualifying round" optimization
It would probably improve performance, with very little risk of suboptimal compression, if we had the option to discard filter strategies whose results were e.g. more than 10% and 8 KiB larger than the smallest when processed with fast compression. More thorough compression could then be used to make the most of a clear winner (sort of like what the Brute strategy does, but applied to the whole image), or to choose among the strategies that were within the margin of error. This is especially true when processing a lot of images in parallel, since narrowing down the options would free up threads for subsequent images. The fast compression could also be used to prioritize the most promising filter strategies when under deadline pressure in sequential mode.
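To make the elimination rule concrete, here's a minimal Rust sketch (not oxipng's actual API; the `Filter` variants and `qualifying_round` name are illustrative, and I'm reading the margin as requiring both conditions before a strategy is dropped):

```rust
// Illustrative stand-ins for oxipng's filter strategies; the real set
// and names differ.
#[derive(Clone, Copy, Debug)]
enum Filter {
    None,
    Sub,
    Up,
    Average,
    Paeth,
    MinSum,
    Entropy,
    Bigrams,
    Brute,
}

/// Keep only the strategies whose fast-compression result stays within
/// the proposed margin of the smallest one. A strategy is eliminated
/// only if it is both more than 10% and more than 8 KiB larger than
/// the best fast result.
fn qualifying_round(fast_sizes: &[(Filter, usize)]) -> Vec<Filter> {
    let best = fast_sizes
        .iter()
        .map(|&(_, size)| size)
        .min()
        .expect("at least one fast trial");
    fast_sizes
        .iter()
        .filter(|&&(_, size)| size <= best + best / 10 || size <= best + 8 * 1024)
        .map(|&(filter, _)| filter)
        .collect()
}
```

The survivors would then get the thorough compression pass; with a single clear winner, the entire remaining effort budget can go to it.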
It's possible this could be implemented as a variant of the fast-evaluation option, where evaluation would be used to select not only reductions but also, subsequently, filter strategies. In my app, Brute is best more often than Bigrams (it's second only to None 🤣), so I'd want Brute, Bigrams and None tested with all possible reduction variants, and then have the other seven strategies each evaluated with whichever reduction(s) scored best or near-best among the big three.
I'll just note that #509 was recently merged, which greatly reduces the losses from fast evaluation. If you want more speed, I'd definitely recommend using it.
Just to comment further here, this is something I'd like to pursue for a future release (though no promises as to when I'll get to it).
The idea would basically be to combine the reduction and filter evaluations into one, with numReductions * numFilters fast trials, then pick some number of top candidates to run full trials on, with thresholds dependent on the -o level.
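A rough sketch of that shape, with hypothetical `fast_trial` and `shortlist` stand-ins (the per-level candidate counts are illustrative, not actual oxipng thresholds):

```rust
#[derive(Clone, Copy, Debug)]
struct Reduction(u8); // stand-in for one reduction variant
#[derive(Clone, Copy, Debug)]
struct FilterStrategy(u8); // stand-in for one filter strategy

fn fast_trial(r: Reduction, f: FilterStrategy) -> usize {
    // Placeholder so the sketch compiles: a real implementation would
    // compress one pass at a fast zlib setting and return the byte count.
    1024 + (r.0 as usize) * 37 + (f.0 as usize) * 11
}

/// Run numReductions * numFilters fast trials, then keep the top
/// candidates for full trials; how many advance depends on the -o level.
fn shortlist(
    reductions: &[Reduction],
    filters: &[FilterStrategy],
    opt_level: u8,
) -> Vec<(Reduction, FilterStrategy)> {
    let mut trials: Vec<(Reduction, FilterStrategy, usize)> = reductions
        .iter()
        .flat_map(|&r| filters.iter().map(move |&f| (r, f, fast_trial(r, f))))
        .collect();
    trials.sort_by_key(|&(_, _, size)| size);

    // Illustrative mapping only: higher -o levels let more candidates
    // through to the expensive full trials.
    let top_k = match opt_level {
        0..=2 => 1,
        3..=4 => 3,
        _ => 6,
    };
    trials
        .into_iter()
        .take(top_k)
        .map(|(r, f, _)| (r, f))
        .collect()
}
```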
https://github.com/shssoichiro/oxipng/pull/640#issuecomment-2273491567 §2 seems to be a similar idea.