
Crossing boundary between units in a single graph

Open jasonwilliams opened this issue 5 years ago • 5 comments

Hey @rhysd, great work on this. Is anything supposed to happen if some data crosses the boundary between, say, nanoseconds and microseconds?

In my benchmark, this has happened in the bottom two charts: https://jasonwilliams.github.io/boa/dev/bench/

It looks like it's become a lot slower, but it actually sped up from 1 us to 900 ns. Unfortunately, once the chart is created with a unit, I'm guessing that unit is fixed.

Think I answered my own question: https://github.com/rhysd/github-action-benchmark/blob/master/src/default_index_html.ts#L198

jasonwilliams avatar Jan 21 '20 22:01 jasonwilliams

I have an idea, which could be to use math.js to convert whatever future values come in back to the right unit, e.g. `math.eval('901.86 ns to us')` => `0.90186 us`.

This could be broken down into 2 tasks:

  1. get the most common unit for that series
  2. use math.js to convert current value + unit to that unit
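The two tasks above could be sketched roughly like this. This is a minimal self-contained sketch with hypothetical names, using a hand-rolled factor table instead of math.js so it runs on its own:

```typescript
// Conversion factors to seconds for the time units the charts use.
const FACTORS: Record<string, number> = {
  s: 1,
  ms: 1e-3,
  us: 1e-6,
  ns: 1e-9,
};

interface Point {
  value: number;
  unit: string;
}

// Task 1: find the unit that appears most often in the series.
function mostCommonUnit(series: Point[]): string {
  const counts = new Map<string, number>();
  for (const p of series) {
    counts.set(p.unit, (counts.get(p.unit) ?? 0) + 1);
  }
  let best = series[0].unit;
  let bestCount = 0;
  for (const [unit, count] of counts) {
    if (count > bestCount) {
      best = unit;
      bestCount = count;
    }
  }
  return best;
}

// Task 2: convert every point to that unit via the factor table.
function normalise(series: Point[]): Point[] {
  const target = mostCommonUnit(series);
  return series.map(p => ({
    value: (p.value * FACTORS[p.unit]) / FACTORS[target],
    unit: target,
  }));
}
```

With this, a series like `[1 us, 0.9 us, 901.86 ns]` would plot as `[1, 0.9, 0.90186]` in `us`, so the speed-up shows up correctly instead of looking like a 900x regression.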

jasonwilliams avatar Jan 21 '20 22:01 jasonwilliams

This is a bit messy right now, but I've got it working here: https://github.com/jasonwilliams/github-action-benchmark/blob/local-add-criterion-support/src/default_index_html.ts

Example of it working here (on the bottom graph, highlight the third node; it should be in ns): https://jasonwilliams.github.io/boa/dev/bench/

jasonwilliams avatar Jan 23 '20 17:01 jasonwilliams

Thank you for the work. Let me take a look after Criterion.rs support has landed.

rhysd avatar Jan 24 '20 04:01 rhysd

Any news on this?

Bench tools such as Catch2 can output the value and the range using different units; at the moment only the unit that comes with the value is stored and displayed, which results in storage and display errors.

See Catch2 log:

[Screenshot 2024-04-15 at 12 35 29]

Corresponding entry in the data.js file:

[Screenshot 2024-04-15 at 12 35 39]

This obviously displays as a mean value of 9.24651 ms with a deviation of ~556 ms.
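One way to avoid this would be to convert the range into the value's unit before storing it. A minimal sketch (hypothetical function names, not the project's actual code) so that a deviation of 556 us next to a mean of 9.24651 ms is stored as 0.556 ms rather than read as 556 ms:

```typescript
// Conversion factors to seconds for the time units Catch2 emits.
const TO_SECONDS: Record<string, number> = { s: 1, ms: 1e-3, us: 1e-6, ns: 1e-9 };

// Express a range/deviation in the same unit as the mean value,
// so a single unit can be stored for the whole entry.
function rangeInValueUnit(range: number, rangeUnit: string, valueUnit: string): number {
  return (range * TO_SECONDS[rangeUnit]) / TO_SECONDS[valueUnit];
}
```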

wwerkk avatar Apr 15 '24 10:04 wwerkk

Normalising the units when filling the datasets, e.g. to seconds, also does the trick, unless you care that much about displaying in the most readable unit.
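That simpler alternative could look like this, a hypothetical sketch that stores everything in one fixed unit (seconds) when filling the chart datasets:

```typescript
// Conversion factors from each time unit to seconds.
const TO_SECONDS: Record<string, number> = { s: 1, ms: 1e-3, us: 1e-6, ns: 1e-9 };

// Convert any (value, unit) pair to seconds before it goes into a dataset,
// so every series shares a single axis unit.
function toSeconds(value: number, unit: string): number {
  return value * TO_SECONDS[unit];
}
```

This sidesteps the unit-boundary problem entirely, at the cost of small magnitudes on the axis (e.g. `9.0186e-7` instead of `901.86 ns`).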

> I have an idea, which could be to use math.js to convert whatever future values back to the right unit, math.eval(901.86 ns to us) => 0.90186 us.
>
> This could be broken down into 2 tasks:
>
> 1. get the most common unit for that series
> 2. use math.js to convert current value + unit to that unit

wwerkk avatar Apr 18 '24 16:04 wwerkk