Upload --print_bench
The big idea: we want a way to compare our detailed bench results in a reproducible way. We can augment `barretenberg/cpp/scripts/ci_benchmark_ivc_flows.sh` to capture `--print_bench` (or, better, the raw data that is used to render it).
When running with `CI=1`, automatically upload benchmark results to https://aztecprotocol.github.io/aztec-packages/bench/barretenberg-breakdowns from `barretenberg/cpp/scripts/ci_benchmark_ivc_flows.sh` (shallow-clone `gh-pages`, write to the `bench/barretenberg-breakdowns` folder, and push).
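A minimal sketch of what that upload step could look like. The results filename, the local directory names, and the assumption that CI credentials already allow pushing to `gh-pages` are all placeholders; the real layout of `ci_benchmark_ivc_flows.sh` may differ:

```bash
#!/usr/bin/env bash
# Sketch: only in CI, copy the captured bench JSON into gh-pages and push.
set -euo pipefail

[ "${CI:-0}" = "1" ] || exit 0  # only upload from CI runs

COMMIT=$(git rev-parse --short HEAD)
RESULTS_JSON="bench-results/${COMMIT}.json"   # assumed output of the bench run

# Shallow-clone only the gh-pages branch to keep the checkout small.
git clone --depth 1 --branch gh-pages \
  https://github.com/AztecProtocol/aztec-packages.git gh-pages

mkdir -p gh-pages/bench/barretenberg-breakdowns
cp "$RESULTS_JSON" gh-pages/bench/barretenberg-breakdowns/

cd gh-pages
git add bench/barretenberg-breakdowns
git commit -m "bench: results for ${COMMIT}"
git push origin gh-pages
```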
Acceptance Criteria
- [ ] When `CI=1` is set and benchmarks are run, the output is captured and pushed as above
- [ ] Results are uploaded to the GitHub Pages location specified above
- [ ] Each commit gets a single json file containing all benchmark results for native code. It should perhaps be a serialized form of our `aggregate()` return value, reconstructed on the JS side.
- [ ] The json file should have a serialized form of `AggregateData` with a key for each bench label, e.g. `deploy_ecdsar1+sponsored_fpc`. We capture native timings here; we can mark them as such.
- We'd love to have this for wasm code too. As a stretch goal, it'd be great to make a macro `BB_BENCH_ENABLE_WASM("...")` that is NOT disabled in WASM and see if we can't just get the major numbers.
- [ ] Handle concurrent pushes appropriately: multiple CI runs shouldn't overwrite each other, so you will need to catch git push failures and retry up to some limit (see the retry sketch after this list)
- [ ] Make a working index.html. Idea: give the current C++ code to Claude and ask it to produce an HTML version that can compare results against a baseline. I had good luck doing something similar with JS.
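For the concurrent-push criterion, one possible approach (a sketch only; function name and retry limits are illustrative) is to fetch and rebase whenever the push is rejected, with a bounded number of attempts:

```bash
# Sketch: retry a gh-pages push when another CI run has pushed in the meantime.
push_with_retry() {
  local max_attempts=5
  for attempt in $(seq 1 "$max_attempts"); do
    if git push origin gh-pages; then
      return 0
    fi
    echo "push rejected (attempt ${attempt}/${max_attempts}); rebasing and retrying"
    git fetch origin gh-pages
    # Replay our single results commit on top of whatever landed meanwhile.
    git rebase origin/gh-pages
    sleep $((RANDOM % 5 + 1))  # small jitter to avoid repeated collisions
  done
  return 1
}
```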
> Each commit gets a single json file containing all benchmark results for native code.
Is it necessary to consolidate to a single json file for each commit? `ci_benchmark_ivc_flows.sh` currently runs each example circuit separately, so it would be much easier to have a json for each run.
We can do that if we can compare commits efficiently. There are various ways we can achieve this.
What I'm trying to do in https://github.com/AztecProtocol/aztec-packages/pull/17871 is to have a folder for each example flow and, within each, a json for each commit.
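If each flow ends up with its own json, a single per-commit file for comparison could still be produced cheaply. A sketch, assuming a hypothetical `flows/<flow>/<commit>.json` layout (not what the PR necessarily uses), merging the per-flow files with jq keyed by flow name:

```bash
# Sketch: consolidate per-flow result files for one commit into a single json
# object keyed by flow name. The flows/<flow>/<commit>.json layout is assumed.
COMMIT=$(git rev-parse --short HEAD)
mkdir -p combined
jq -n '[inputs | {key: input_filename | split("/")[-2], value: .}] | from_entries' \
  flows/*/"${COMMIT}.json" > "combined/${COMMIT}.json"
```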
Previous attempts at uploading every flow run resulted in timeouts. https://github.com/AztecProtocol/aztec-packages/pull/18422 attempts to keep only one flow run for now. It may also be possible to host the data and frontend via the Python app on https://ci.aztec-labs.com.
https://github.com/AztecProtocol/iac/pull/25#discussion_r2539437752 (show available flows) is to be addressed in a follow-up.
I need to re-enable log transfers first so that there are flows available.
The app has been deployed. I realised I haven't updated the URL path to http://ci.aztec-labs.com/breakdown-viewer; that will be done in the follow-up.
This issue was automatically closed because it was referenced in AztecProtocol/aztec-packages PR #18511, which has been merged to the default branch.