
Benchmark parameters

Open · str4d opened this issue 8 years ago · 3 comments

The current codespeed model supports two kinds of groupings for benchmarks:

  • Environments (running the same benchmark under different conditions)
  • Parents (organising benchmarks into hierarchies, e.g. having a parent benchmark which itself is an aggregation of sub-benchmarks)

In my own use of codespeed, I'm finding that I want to write a particular benchmark, and then run it under various conditions within the same environment. Examples:

  • Running a benchmark with 1 vs 2 threads
  • Measuring the time to create a Zcash transaction when the wallet has 100k coins vs. 200k coins

So far I've been implementing these as separate benchmarks, but this uses up a lot of UI space and doesn't convey the relationship between them. Parents are not sufficient either, as there isn't a parent benchmark as such, just a group of siblings.

I think a better model for representing these would be for a Result to specify a set of Parameters for the Benchmark it is using. Moving from this:

Result ------------------------------> Benchmark

to something along these lines:

Result ------------------------------> Benchmark
   \                               /
    \--> Value -----> Parameter --/
                  /              /
    /--> Value --/              /
   /                           /
Result -----------------------/
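To make this concrete, here is a hypothetical pair of result submissions under the proposed model. The field names mirror Codespeed's usual result payload, but the `parameters` key is invented for illustration; it stands for the Result -> Value -> Parameter links in the diagram above and is not part of any existing API.

```python
# Hypothetical only: two submissions for the *same* benchmark in the same
# environment, distinguished by a proposed "parameters" mapping. The
# "parameters" key does not exist in Codespeed today.
result_one_thread = {
    "commitid": "abc123",
    "branch": "default",
    "project": "MyProject",
    "executable": "myexe",
    "benchmark": "create_transaction",
    "environment": "Dual Core",
    "result_value": 4.2,
    "parameters": {"threads": "1"},  # proposed addition
}

# Same benchmark, same environment; only the parameter value differs.
result_two_threads = dict(result_one_thread,
                          result_value=2.3,
                          parameters={"threads": "2"})
```

Under the current model these would have to be two separate benchmarks (e.g. `create_transaction_1` and `create_transaction_2`); under the proposed one they are two Results of a single Benchmark.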

str4d · Aug 24 '17 16:08

Here are the properties I'd like to see in this setup:

  • A Benchmark may have zero or more Parameters.
  • A Result must specify a value for each Parameter of its Benchmark.
  • If the Benchmark for a Result does not exist, then when the Benchmark is automatically created, any Parameters the Result specifies are created along with it.
  • For sanity, a Result cannot automatically add a Parameter to an existing Benchmark; the admin must manually create the Parameter first, specifying a corresponding value for all existing Results (which were all implicitly using the same value for that previously-unspecified Parameter). A sketch of these checks follows this list.
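As a minimal, framework-free sketch of these rules (all names invented for illustration), the check a Result submission would have to pass might look like:

```python
# Illustrative sketch only; real field/model names would differ.
def validate_result_parameters(benchmark_parameters, result_parameters):
    """Check a Result's parameter values against its Benchmark's Parameters.

    benchmark_parameters: set of Parameter names defined on the Benchmark.
    result_parameters: dict mapping Parameter name -> value for this Result.
    """
    missing = benchmark_parameters - result_parameters.keys()
    if missing:
        # A Result must specify a value for each Parameter of its Benchmark.
        raise ValueError("missing values for parameters: %s" % sorted(missing))
    unknown = result_parameters.keys() - benchmark_parameters
    if unknown:
        # A Result cannot add a Parameter on the fly; the admin must create
        # it first and backfill a value for all existing Results.
        raise ValueError("unknown parameters: %s" % sorted(unknown))

validate_result_parameters({"threads"}, {"threads": "2"})  # passes
# validate_result_parameters({"threads"}, {})              # raises: missing
# validate_result_parameters(set(), {"threads": "2"})      # raises: unknown
```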

I can see two ways to implement this, with competing trade-offs:

  • Create a Parameter model with a Benchmark ForeignKey, and add a ManyToManyField(Parameter) to Result, using a ParameterValue intermediate model to store the value.
    • Makes it easier to enforce that a Result specifies a single value for every Parameter.
      • Django doesn't support unique_together for ManyToManyFields, so that would need to be replaced entirely with a custom uniqueness check. This would be the case for both implementation options.
    • Makes it a bit harder to collect all Results with the same value.
    • IMHO it is semantically weaker: a Parameter value is not a property of a Result; rather, it is a property of the test setup (as Environment is), and thus should be a logical object that multiple Results can reference.
  • Create a Parameter model with a Benchmark ForeignKey, and a ParameterValue model with a Parameter ForeignKey. Add a ManyToManyField(ParameterValue) to Result.
    • Makes it easy to collect all Results with the same value.
    • Makes it more complex to enforce that a Result specifies a single value for every Parameter (see the model sketch after this list).
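Here is a minimal Django sketch of the second option, assuming an existing Benchmark model and using simplified, invented field names (this is not Codespeed's actual schema). The standalone check stands in for the unique_together that Django cannot enforce on ManyToManyFields; note that M2M rows only exist once the Result has been saved, so it has to run after the values are assigned.

```python
from django.core.exceptions import ValidationError
from django.db import models


class Parameter(models.Model):
    # One row per named parameter of a benchmark, e.g. "threads".
    benchmark = models.ForeignKey('Benchmark', on_delete=models.CASCADE)
    name = models.CharField(max_length=100)

    class Meta:
        unique_together = ('benchmark', 'name')


class ParameterValue(models.Model):
    # One shared row per (parameter, value) pair, e.g. threads=2, so that
    # many Results can reference the same logical value.
    parameter = models.ForeignKey(Parameter, on_delete=models.CASCADE)
    value = models.CharField(max_length=100)

    class Meta:
        unique_together = ('parameter', 'value')


class Result(models.Model):
    # Heavily simplified; the real Result has revision, executable, etc.
    benchmark = models.ForeignKey('Benchmark', on_delete=models.CASCADE)
    value = models.FloatField()
    parameter_values = models.ManyToManyField(ParameterValue, blank=True)


def check_result_parameter_values(result):
    """Enforce exactly one ParameterValue per Parameter of the Benchmark."""
    expected = set(result.benchmark.parameter_set.values_list('pk', flat=True))
    chosen = list(result.parameter_values.values_list('parameter_id',
                                                      flat=True))
    if len(chosen) != len(set(chosen)) or set(chosen) != expected:
        raise ValidationError('Result must specify exactly one value for '
                              'each Parameter of its Benchmark.')
```

Collecting all Results that share a value is then a single query, e.g. `Result.objects.filter(parameter_values=some_value)`.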

str4d · Aug 25 '17 11:08

I'm going to proceed with implementing this using a ParameterValue model, given that the upsides to the other approach are minimal.

str4d · Aug 25 '17 11:08

Going for the second option probably makes sense. However, this warrants some discussion, as you've brought up quite an important topic. Some heavy users of Codespeed were hitting its limitations when they wanted to store and compare results with many different "attributes" (your parameters) attached, and have discussed possible implementations within their own companies. Have a read: https://groups.google.com/forum/#!topic/codespeed/liOwjr_uzbg

So I see this as the start of Codespeed 2.0; this is good thinking. Would you like to start a discussion thread there? We might want to outline a roadmap where Codespeed 1.0 is released first, and then the project is restructured around parameters/attributes. Of course, an incremental approach is also an option; it's just that a bit of restructuring would do well before adding more and more features.

tobami · Aug 26 '17 09:08