Investigate the time taken, and number of benchmarks in core
This is the starting place to begin running core benchmarks on a regular basis.
First steps: Put together a benchmark run that summarises the time taken to run each suite and records the number of results produced by the run (a rough sketch follows below).
Then, at a future meeting: Look at the results and decide on a subset to run on a regular basis that fits into the time available and can be summarised appropriately.
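A minimal sketch of what such a collection run could look like, assuming it is executed from the root of a Node.js core checkout, that each directory under benchmark/ is one suite, and that each benchmark file prints one result per line of stdout. The script itself is illustrative, not an agreed implementation:

```js
// Hypothetical sketch: time each core benchmark suite and count its results.
// Assumes a Node.js checkout layout where every directory under benchmark/
// is a suite and each benchmark file writes one result per stdout line.
// Some suites need built fixtures or addons, so treat these numbers as a
// first approximation.
'use strict';
const { execFileSync } = require('child_process');
const fs = require('fs');
const path = require('path');

const benchmarkDir = path.join(process.cwd(), 'benchmark');
const suites = fs.readdirSync(benchmarkDir, { withFileTypes: true })
  .filter((entry) => entry.isDirectory())
  .map((entry) => entry.name);

for (const suite of suites) {
  const suiteDir = path.join(benchmarkDir, suite);
  const files = fs.readdirSync(suiteDir).filter((f) => f.endsWith('.js'));
  const start = Date.now();
  let results = 0;
  for (const file of files) {
    // Run each benchmark file with the current node binary and collect stdout.
    const output = execFileSync(process.execPath, [path.join(suiteDir, file)],
                                { encoding: 'utf8' });
    results += output.split('\n').filter((line) => line.trim() !== '').length;
  }
  const seconds = ((Date.now() - start) / 1000).toFixed(1);
  console.log(`${suite}\t${seconds}s\t${results} results`);
}
```

The per-suite timings and result counts from a run like this could then feed the later discussion about which subset fits into the available CI time.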
@mhdawson I'm writing a script to go through and collect the runtime for each core benchmark. Could you create a temporary job in Jenkins that I can modify and use to run this?
Here you go: https://ci.nodejs.org/job/benchmark-node-micro-benchmarks-gareth/
Closing as we have not made progress. @gabrielschulhof is going to write a proposal for an alternate approach.
Spreadsheet tracking relative importance of benchmarks:
https://docs.google.com/spreadsheets/d/17ey-6r_sTVYpy6Zv0n55kqkVgqRf67aaf6Ub7eebGgo/edit#gid=0
... and the survey: https://docs.google.com/forms/d/1BlVwIYZPBBmR46Ru3okvddLiPj5Zy03XPE2kyW55VzQ/edit
Here's a spreadsheet with the processed responses, where the benchmarks are sorted in order of popularity:
https://docs.google.com/spreadsheets/d/1_7VrAFO8K9KdQW8qEmnKnVBb514SMRqYiitoDu_cMiM/edit#gid=979605