
Benchmarking the Library

Open Bhargavasomu opened this issue 6 years ago • 2 comments

What is wrong?

I personally think it would be better if some benchmarks were added to this library, to make sure we are not undoing existing optimizations while adding new functionality. Furthermore, this is a fairly low-level module where I believe speed matters.

How can it be fixed

I think we could do something similar to what we are doing here

Bhargavasomu avatar Feb 05 '19 10:02 Bhargavasomu

@hwwhww @ChihChengLiang do you have any suggestions for the benchmarking tests? Right now, all I can think of to benchmark is pairing. Any new tests would be helpful.
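A pairing benchmark could start from a tiny `timeit`-based harness like the sketch below. This is only an illustration: the timed workload here is a stand-in modular exponentiation (the same flavour of big-integer field arithmetic a pairing performs), not an actual py_ecc call, and the `bench` helper and `P` modulus are hypothetical names introduced for this sketch.

```python
import timeit

def bench(label, fn, number=5):
    """Run `fn` `number` times and report the best (minimum) single run.

    The minimum is the conventional choice for micro-benchmarks, since
    noise only ever makes a run slower, never faster.
    """
    best = min(timeit.repeat(fn, number=1, repeat=number))
    print(f"{label}: {best * 1000:.3f} ms (best of {number})")
    return best

# Stand-in workload: exponentiation in a large prime field. In the real
# suite this slot would hold the pairing call being benchmarked.
P = 2**255 - 19  # hypothetical stand-in modulus for illustration only
elapsed = bench("field exponentiation (pairing stand-in)", lambda: pow(3, P - 2, P))
```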

/cc @pipermerriam @carver

Bhargavasomu avatar Feb 26 '19 16:02 Bhargavasomu

Some ideas I can come up with for now. Most of these can only start after the BLS aggregation API is migrated from Trinity to here.

Benchmark that focuses on the bottleneck

This protects the most performance-critical paths, so we don't have to worry about breaking things.

Example:

  • Verify an aggregated signature from 2/3 of 312,500 validators.
  • Verify a crosslink.

Reference for signature aggregation and crosslink calculation: https://ethresear.ch/t/pragmatic-signature-aggregation-with-bls/2105
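The shape of the aggregation bottleneck benchmark could look like the sketch below. To keep it self-contained it uses a toy multiplicative group mod a Mersenne prime instead of the real BLS curve, and a much smaller validator count than 312,500; the names `aggregate` and `bench_aggregate` are hypothetical. It is not cryptographically meaningful, it only demonstrates how to time aggregation as a function of validator-set size.

```python
import random
import time

P = 2**127 - 1  # Mersenne prime; toy stand-in for the real group modulus
H = 7           # stand-in for a message hashed to a group element

def aggregate(sigs):
    """Aggregate toy 'signatures' by taking their product in the group,
    mirroring how BLS signatures aggregate by group addition."""
    acc = 1
    for s in sigs:
        acc = acc * s % P
    return acc

def bench_aggregate(n):
    """Time aggregation of n toy signatures and sanity-check the result."""
    keys = [random.randrange(1, P) for _ in range(n)]
    sigs = [pow(H, k, P) for k in keys]
    t0 = time.perf_counter()
    agg = aggregate(sigs)
    dt = time.perf_counter() - t0
    # Sanity: the product of H^k_i equals H^(sum of k_i) in the group.
    assert agg == pow(H, sum(keys), P)
    return dt

for n in (100, 1000):
    print(f"n={n}: {bench_aggregate(n) * 1000:.2f} ms")
```

In the real suite, the timed region would instead cover aggregate-signature verification over the curve, and `n` would scale up toward the 2/3-of-312,500 target.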

Benchmark that gives insights on APIs

This makes it more straightforward to map the measured costs onto concrete use cases.

Example:

  • sign
  • verify
  • verify_multiple
  • etc...
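One way to organize these API-level benchmarks is a single suite that iterates over a table of named operations, so adding a new API to the table automatically benchmarks it. In the sketch below the operations are stand-in lambdas rather than the real `sign`/`verify`/`verify_multiple` calls, and `run_suite` is a hypothetical helper name.

```python
import timeit

# Hypothetical API surface to benchmark; the names mirror the list above,
# but the bodies are stand-in workloads, not the real BLS calls.
OPERATIONS = {
    "sign": lambda: pow(3, 65537, 2**61 - 1),
    "verify": lambda: pow(5, 65537, 2**61 - 1),
}

def run_suite(ops, number=100):
    """Return best-case seconds-per-call for each named operation."""
    results = {}
    for name, fn in ops.items():
        results[name] = min(timeit.repeat(fn, number=number, repeat=3)) / number
    return results

for name, secs in run_suite(OPERATIONS).items():
    print(f"{name}: {secs * 1e6:.1f} µs/op")
```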

Benchmark that gives insights on units

So we can be more sensitive to performance changes.

Example: This page shows how performance changes for each operation over time. https://speed.z.cash/timeline/#/?exe=1,2&base=1+9&ben=bls12_381::ec::g1::bench_g1_add_assign&env=1&revs=50&equid=off&quarts=on&extr=on
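To build that kind of per-revision timeline, each CI run only needs to append one timing sample per operation to a log that a dashboard can plot later. The sketch below records samples as JSON lines; `record` is a hypothetical helper, and the timed operation is again a stand-in rather than a real curve operation.

```python
import json
import os
import tempfile
import time
import timeit

def record(path, name, fn, rev="HEAD"):
    """Time `fn` and append one sample for benchmark `name` at revision
    `rev` to a JSON-lines file, so performance can be plotted over time."""
    best = min(timeit.repeat(fn, number=10, repeat=3)) / 10
    sample = {"benchmark": name, "rev": rev, "seconds": best, "ts": time.time()}
    with open(path, "a") as f:
        f.write(json.dumps(sample) + "\n")
    return sample

# Demo with a stand-in operation; the real suite would record e.g. a G1 add,
# tagging each sample with the actual git revision under test.
path = os.path.join(tempfile.mkdtemp(), "timeline.jsonl")
record(path, "g1_add_stand_in", lambda: pow(3, 65537, 2**61 - 1), rev="dummy-rev")
```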

ChihChengLiang avatar Feb 26 '19 17:02 ChihChengLiang