
Upload --print_bench

ludamad opened this issue 2 months ago · 8 comments

The big idea: we want a way to compare our detailed bench results in a reproducible way. We can augment barretenberg/cpp/scripts/ci_benchmark_ivc_flows.sh to capture --print_bench (or, better, the raw data that is used to render it).

When running with CI=1, barretenberg/cpp/scripts/ci_benchmark_ivc_flows.sh should automatically upload benchmark results to https://aztecprotocol.github.io/aztec-packages/bench/barretenberg-breakdowns (shallow-clone gh-pages, write to the bench/barretenberg-breakdowns folder, and push).
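A minimal sketch of what that could look like inside ci_benchmark_ivc_flows.sh. The bench_output.json artifact name and the one-json-per-commit layout are assumptions, not existing conventions:

```bash
# Sketch only: upload captured bench output to gh-pages when running in CI.
# BENCH_JSON and the per-commit file name are assumed, not existing names.
if [ "${CI:-0}" = "1" ]; then
  BENCH_JSON="$(pwd)/bench_output.json"          # raw data behind --print_bench (assumed)
  COMMIT_SHA="$(git rev-parse HEAD)"

  TMP_DIR="$(mktemp -d)"
  git clone --depth 1 --branch gh-pages \
    https://github.com/AztecProtocol/aztec-packages.git "$TMP_DIR"

  DEST="$TMP_DIR/bench/barretenberg-breakdowns"
  mkdir -p "$DEST"
  cp "$BENCH_JSON" "$DEST/${COMMIT_SHA}.json"    # one json per commit

  git -C "$TMP_DIR" add bench/barretenberg-breakdowns
  git -C "$TMP_DIR" commit -m "bench: barretenberg breakdowns for ${COMMIT_SHA}"
  git -C "$TMP_DIR" push origin gh-pages
fi
```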

Acceptance Criteria

  • [ ] When CI=1 is set and benchmarks are run, the output is captured and pushed as above
  • [ ] Results are uploaded to the GitHub Pages location specified above
  • [ ] Each commit gets a single json file containing all benchmark results for native code. It could be a serialized form of our aggregate() return value, reconstructed on the JS side.
  • [ ] The json file should contain a serialized form of AggregateData with a key for each bench label, e.g. deploy_ecdsar1+sponsored_fpc (see the JSON sketch after this list). We capture native timings here, so we can mark them as such.
    • We'd love to have this for WASM code: as a stretch goal, it'd be great to add a macro BB_BENCH_ENABLE_WASM("...") that is NOT disabled in WASM builds and see whether we can get the major numbers that way.
  • [ ] Handle concurrent pushes appropriately: multiple CI runs shouldn't overwrite each other, so capture git push failures and retry up to some limit (see the retry sketch after this list).
  • [ ] Make a working index.html. Idea: give the current C++ code to Claude and ask it to make an HTML version that can compare results against a baseline. I had good luck doing something similar with JS.
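A sketch of the per-commit JSON shape implied above. Only the key-per-bench-label structure comes from the issue; the per-entry fields (platform, breakdown, time_ms, count) are assumptions about what the serialized AggregateData might carry:

```bash
# Sketch only: example shape for bench/barretenberg-breakdowns/<commit>.json.
# Assumes the destination directory already exists from the upload step above.
cat > "bench/barretenberg-breakdowns/${COMMIT_SHA}.json" <<'EOF'
{
  "deploy_ecdsar1+sponsored_fpc": {
    "platform": "native",
    "breakdown": [
      { "label": "ClientIVC::accumulate", "time_ms": 1234.5, "count": 17 }
    ]
  }
}
EOF
```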
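And a sketch of the push-with-retry requirement. The retry limit, backoff, and rebase-before-retry strategy are assumptions about how to resolve concurrent pushes:

```bash
# Sketch only: retry the gh-pages push a bounded number of times, rebasing on
# the remote branch between attempts so concurrent CI runs don't clobber each other.
MAX_ATTEMPTS=5
for attempt in $(seq 1 "$MAX_ATTEMPTS"); do
  if git -C "$TMP_DIR" push origin gh-pages; then
    break
  fi
  if [ "$attempt" = "$MAX_ATTEMPTS" ]; then
    echo "gh-pages push failed after ${MAX_ATTEMPTS} attempts" >&2
    exit 1
  fi
  git -C "$TMP_DIR" pull --rebase origin gh-pages   # pick up pushes from other runs
  sleep $((attempt * 5))                            # simple backoff
done
```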

ludamad avatar Sep 24 '25 19:09 ludamad

Each commit gets a single json file containing all benchmark results for native code.

Is it necessary to consolidate to a single json file for each commit? ci_benchmark_ivc_flows.sh currently runs each example circuit separately, so it would be much easier to have a json per run.

johnathan79717 avatar Oct 24 '25 10:10 johnathan79717

We can do that as long as we can still compare commits efficiently. There are various ways we could achieve this.

ludamad avatar Oct 24 '25 11:10 ludamad

What I'm trying to do in https://github.com/AztecProtocol/aztec-packages/pull/17871 is to have a folder for each example flow and, within each, a json for each commit.

johnathan79717 avatar Oct 24 '25 16:10 johnathan79717

Previous attempts at uploading every flow run resulted in timeouts. https://github.com/AztecProtocol/aztec-packages/pull/18422 attempts to keep only one flow run for now. It may also be possible to host the data and frontend via the python app on https://ci.aztec-labs.com.

johnathan79717 avatar Nov 17 '25 12:11 johnathan79717

https://github.com/AztecProtocol/iac/pull/25#discussion_r2539437752 (show available flows) to be addressed in a follow-up.

johnathan79717 avatar Nov 19 '25 14:11 johnathan79717

I need to re-enable log transferring first so there will be flows available.

johnathan79717 avatar Nov 19 '25 16:11 johnathan79717

The app has been deployed. I realised I haven't updated the URL path (http://ci.aztec-labs.com/breakdown-viewer); that will be done in the follow-up.

johnathan79717 avatar Nov 19 '25 16:11 johnathan79717

This issue was automatically closed because it was referenced in AztecProtocol/aztec-packages PR #18511 which has been merged to the default branch.

View workflow run

AztecBot avatar Nov 20 '25 11:11 AztecBot