
Support changing the default of "it will measure for at least 3 seconds"

Open RalfJung opened this issue 3 years ago • 2 comments

It would be nice if one could change the default of "it will measure for at least 3 seconds"; e.g. I might be fine with 30s of measurement for tests that usually take around 2-5s. Being able to specify the time rather than having to use -m is nice because my level of patience to wait for results is mostly constant, so I am fine with more runs when the individual runs are faster.

Looks like this is where the time is currently hard-coded: https://github.com/sharkdp/hyperfine/blob/3df63ba2bb14e75775e0e1fae744043e7ad79a5f/src/options.rs#L208
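The role that hard-coded constant plays can be sketched roughly as follows. This is an illustrative, simplified sketch rather than hyperfine's actual code: the names `MIN_RUNS`, `MIN_BENCHMARKING_TIME_SECS`, and `target_run_count` are made up for illustration, and the idea is simply that after timing an initial run, more runs are scheduled until the estimated total measurement time reaches the minimum benchmarking time, never dropping below a minimum run count.

```rust
// Illustrative sketch (not hyperfine's actual source) of how a minimum
// benchmarking time turns into a run count: keep at least `MIN_RUNS`
// runs, but add more until the estimated total reaches the time floor.
const MIN_RUNS: u64 = 10;
const MIN_BENCHMARKING_TIME_SECS: f64 = 3.0; // the hard-coded default in question

fn target_run_count(estimated_run_secs: f64) -> u64 {
    let runs_for_time = (MIN_BENCHMARKING_TIME_SECS / estimated_run_secs).ceil() as u64;
    runs_for_time.max(MIN_RUNS)
}

fn main() {
    // A fast command (10 ms per run) gets many runs to fill the 3 s floor:
    assert_eq!(target_run_count(0.01), 300);
    // A slow command (2 s per run) falls back to the minimum run count:
    assert_eq!(target_run_count(2.0), MIN_RUNS);
}
```

This is why a 2-5 s command only ever gets the minimum number of runs under the 3 s default: the time floor is already exceeded by a single run.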

RalfJung avatar Jul 04 '22 19:07 RalfJung

Thank you for your feedback.

I certainly agree that having a hardcoded default is not the best option here. However, I would like to avoid having too many (possibly conflicting) command-line options to control the number of runs, mostly because I think they would be difficult to explain to users.

So before we decide to integrate this: what would be possible alternatives to the introduction of a new --min-benchmarking-time <secs> option? For example, is the "min. benchmarking time" strategy reasonable at all? Maybe something like a confidence-based min. benchmarking time would be more suitable as a default? (#523)

Conversely, are there other use cases for the suggested option? Maybe someone is aware of background interference with their benchmarks on a timescale of e.g. a few minutes and would therefore like to set --min-benchmarking-time=10min.
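The `10min` example above implies accepting unit suffixes rather than plain seconds. A hypothetical parser for such values (this helper is not part of hyperfine; it is just a sketch of what accepting `s`/`min` suffixes could look like) might be:

```rust
// Hypothetical helper: parse values like "30", "45s", or "10min" into
// seconds. Checks the "min" suffix before the "s" suffix, then falls
// back to interpreting a bare number as seconds.
fn parse_duration_secs(input: &str) -> Option<f64> {
    let trimmed = input.trim();
    if let Some(mins) = trimmed.strip_suffix("min") {
        return mins.trim().parse::<f64>().ok().map(|m| m * 60.0);
    }
    if let Some(secs) = trimmed.strip_suffix('s') {
        return secs.trim().parse::<f64>().ok();
    }
    trimmed.parse::<f64>().ok()
}

fn main() {
    assert_eq!(parse_duration_secs("10min"), Some(600.0));
    assert_eq!(parse_duration_secs("45s"), Some(45.0));
    assert_eq!(parse_duration_secs("30"), Some(30.0));
    assert_eq!(parse_duration_secs("abc"), None);
}
```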

sharkdp avatar Jul 11 '22 20:07 sharkdp

> confidence-based min. benchmarking time

That also sounds pretty awesome, yes.

But when running benchmarks while working on a feature or optimization, the constant is usually the amount of time I am willing to wait for the benchmarks to complete in each edit-compile-bench cycle. I can do more extensive measurements at the end, but during development I just need reasonably good, quick feedback. So using something time-based makes a lot of sense for that. I might call 5 runs "good enough", but if I can have 12 runs in 10s then I'll take those.
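The trade-off described here can be made concrete with a small sketch (the function `runs_within_budget` and its parameters are made up for illustration): with a fixed time budget, faster individual runs simply yield more samples, without the user touching any run count.

```rust
// Sketch of a fixed time budget: the number of runs is whatever fits
// into the budget, but never fewer than an acceptable minimum.
fn runs_within_budget(budget_secs: f64, per_run_secs: f64, min_runs: u64) -> u64 {
    ((budget_secs / per_run_secs).floor() as u64).max(min_runs)
}

fn main() {
    // With a 10 s budget: 5 runs at 2 s each is "good enough"...
    assert_eq!(runs_within_budget(10.0, 2.0, 5), 5);
    // ...but if each run only takes ~0.8 s, we get 12 runs for free.
    assert_eq!(runs_within_budget(10.0, 0.8, 5), 12);
}
```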

This came up for me because I am running a bunch of benchmarks to cover different workloads, and they take different amounts of time each. I don't want to tweak the run counts for each of them separately.

PS: By the way, thank you so much for this amazing tool. :)

RalfJung avatar Jul 11 '22 20:07 RalfJung

Given how easy this should be to implement, I'm inclined to just add this as a new option, but maybe mark it "hidden" for now... until we are sure that we like the "CLI design". This would hide the option from the --help text but still make it usable. An experimental feature, if you will.
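For context on what a "hidden" option means: the option is still parsed and usable, it just does not appear in the generated --help text. hyperfine uses the clap crate, where this is a per-argument hide setting; the dependency-free sketch below (with an invented `Opt` struct and `help_text` function) only illustrates the idea.

```rust
// Sketch of "hidden" CLI options: they stay in the parser but are
// skipped when rendering the --help text.
struct Opt {
    name: &'static str,
    hidden: bool,
}

fn help_text(opts: &[Opt]) -> String {
    opts.iter()
        .filter(|o| !o.hidden)
        .map(|o| format!("  --{}\n", o.name))
        .collect()
}

fn main() {
    let opts = [
        Opt { name: "min-runs", hidden: false },
        Opt { name: "min-benchmarking-time", hidden: true }, // experimental
    ];
    let help = help_text(&opts);
    assert!(help.contains("--min-runs"));
    assert!(!help.contains("min-benchmarking-time"));
}
```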

sharkdp avatar Aug 15 '22 19:08 sharkdp

@sharkdp Are we looking to add this (hidden by default)? I would like to take a stab at it if you don't mind!

udsamani avatar Aug 28 '22 21:08 udsamani

That would be great!

sharkdp avatar Aug 29 '22 05:08 sharkdp

@sharkdp Just confirming: are we planning to go with confidence-based benchmarking, or just introduce a parameter for the existing strategy but keep it hidden?

udsamani avatar Aug 30 '22 10:08 udsamani

> @sharkdp Just confirming: are we planning to go with confidence-based benchmarking, or just introduce a parameter for the existing strategy but keep it hidden?

Both :smile:

sharkdp avatar Sep 02 '22 06:09 sharkdp

@udsamani Sorry for jumping in here, but I wanted to include this in the release I made yesterday. There are a lot of other open issues on this tracker if you want to work on hyperfine!

sharkdp avatar Sep 08 '22 18:09 sharkdp