
[FEATURE] capture branch/commit metadata in perf results file

Open mdjastrzebski opened this issue 2 years ago • 6 comments

Is your feature request related to a problem? Please describe. Currently current.perf and baseline.perf contain only performance results, so these files do not tell you which code version the results were gathered on. Capturing that info would tag our measurements with the code version.

Describe the solution you'd like

  • The first line of the .perf file could contain { metadata: { branch: 'branchname', hash: 'commitHash' }}, i.e. a special first line that holds only metadata related to the measurements rather than a regular performance measurement entry.
  • Since we do not want to directly integrate reassure measure (which creates these files) with any version control system, these values could be passed as runtime flags, e.g. reassure measure --branch branchname --hash commitHash.
  • The user would pass this data from their reassure-tests.sh script, on CI and perhaps locally (see the sketch below).
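A minimal sketch of how a reassure-tests.sh script might pass these values, assuming the --branch/--hash flags proposed above; the git commands are just one possible way of obtaining the values:

# Hypothetical excerpt from reassure-tests.sh, flag names as proposed above.
BRANCH="$(git rev-parse --abbrev-ref HEAD)"
COMMIT_HASH="$(git rev-parse HEAD)"

reassure measure --branch "$BRANCH" --hash "$COMMIT_HASH"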

Describe alternatives you've considered None

Additional context None

mdjastrzebski avatar Jun 23 '22 08:06 mdjastrzebski

Hi all, I would like to take a shot at this. Please let me know if I can proceed.

ShaswatPrabhat avatar Aug 18 '22 05:08 ShaswatPrabhat

Ok, go ahead @ShaswatPrabhat

mdjastrzebski avatar Aug 18 '22 06:08 mdjastrzebski

@ShaswatPrabhat are you still interested in contributing this PR?

mdjastrzebski avatar Sep 19 '22 13:09 mdjastrzebski

Apologies, I've been facing some health issues. I will not be able to contribute to this right now.

ShaswatPrabhat avatar Sep 19 '22 14:09 ShaswatPrabhat

@ShaswatPrabhat sorry to hear that. Get well soon!

thymikee avatar Sep 19 '22 14:09 thymikee

@ShaswatPrabhat get well soon!

mdjastrzebski avatar Sep 19 '22 20:09 mdjastrzebski

Hi, I'm back and looking to contribute again.

Are we expecting both of these options to be mandatory from the CLI? Should we default to some values if the commit and branch are not present?

Thanks,

Shaswat

ShaswatPrabhat avatar Oct 01 '22 11:10 ShaswatPrabhat

@ShaswatPrabhat all of these additional metadata values should be optional. I think that in terms of the CLI we could use the following optional arguments:

yarn reassure --branch [branchName] --commit-hash [commitHash]

With the current.perf file containing the following fields in the special-purpose first line:

{ "metadata": { "branch": "branchName", "commitHash": "commitHash" }}

This will also require that the compare command is able to read these values from the first line of the .perf file and does not treat them as measurement results.
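Something along these lines could work on the compare side; this is just a sketch of the idea (the helper name and return shape are assumptions, not the actual reassure internals), assuming each line of a .perf file is a standalone JSON object:

type PerfMetadata = { branch?: string; commitHash?: string };

function parsePerfFile(contents: string): { metadata?: PerfMetadata; entries: unknown[] } {
  const lines = contents.split('\n').filter((line) => line.trim().length > 0);

  let metadata: PerfMetadata | undefined;
  const entries: unknown[] = [];

  lines.forEach((line, index) => {
    const parsed = JSON.parse(line);

    // The special-purpose first line carries only metadata; every other line is a measurement entry.
    if (index === 0 && parsed.metadata != null) {
      metadata = parsed.metadata;
      return;
    }

    entries.push(parsed);
  });

  return { metadata, entries };
}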

As a final step, these values should be rendered in the JSON and MD output if present in the .perf file.

We can contemplate automatically capturing these values from the environment if we detect a git repo, but I would rather postpone that to a subsequent PR.
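For the record, that future step could look roughly like the sketch below; this is purely illustrative (the function name is made up) and uses Node's child_process, not anything that exists in reassure today:

import { execSync } from 'child_process';

// Illustrative only: best-effort detection of branch and commit hash from a git checkout.
function detectGitMetadata(): { branch?: string; commitHash?: string } {
  try {
    const branch = execSync('git rev-parse --abbrev-ref HEAD', { encoding: 'utf8' }).trim();
    const commitHash = execSync('git rev-parse HEAD', { encoding: 'utf8' }).trim();
    return { branch, commitHash };
  } catch {
    // Not a git repo, or git is not available: fall back to no metadata.
    return {};
  }
}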

mdjastrzebski avatar Oct 01 '22 13:10 mdjastrzebski

And just one more point: both baseline.perf and current.perf might have this metadata, for example when comparing two hashes from two different branches. In that case both sets of metadata should be treated as comparable entities and listed in both the .json and .md files as such.

Please correct me if I am wrong.

ShaswatPrabhat avatar Oct 01 '22 14:10 ShaswatPrabhat

Yes, good point. Both baseline and current measurements will have their own metadata, which will help identify which codebase version each was captured against.

We will need to add it to CompareResult, which is also the data structure for the JSON output:

export type CompareResult = {
  significant: CompareEntry[];
  meaningless: CompareEntry[];
  countChanged: CompareEntry[];
  added: AddedEntry[];
  removed: RemovedEntry[];
  errors: string[];
  warnings: string[];
};

I think we could have the following structure for that:

interface MeasurementMetadata {
  branch?: string;
  commitHash?: string;
}

export type CompareResult = {
  baseline: MeasurementMetadata;
  current: MeasurementMetadata;
  // rest of the fields
};
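For illustration, the JSON output could then begin with something like this (the branch names and hashes below are made up):

{
  "baseline": { "branch": "main", "commitHash": "abc1234" },
  "current": { "branch": "feat/metadata", "commitHash": "def5678" },
  "significant": [],
  "meaningless": [],
  "countChanged": [],
  "added": [],
  "removed": [],
  "errors": [],
  "warnings": []
}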

For the MD file, we can put this at the top of the performance report:

function buildMetadata(name: string, metadata?: MeasurementMetadata) {
  if (metadata?.branch && metadata?.commitHash) {
    return `**${name}**: \`${metadata.branch}\` (\`${metadata.commitHash}\`)`;
  }

  if (metadata?.branch) {
    return `**${name}**: \`${metadata.branch}\``;
  }

  if (metadata?.commitHash) {
    return `**${name}**: \`${metadata.commitHash}\``;
  }

  return `**${name}**: missing metadata`;
}


function buildMarkdown(data: CompareResult) {
  let result = headers.h1('Performance Comparison Report');

  result += `\n${buildMetadata('Current', data.current)}`;
  result += `\n${buildMetadata('Baseline', data.baseline)}`;

  if (data.errors?.length) {
    result += `\n\n${headers.h3('Errors')}\n`;
    data.errors.forEach((message) => {
      result += ` 1. 🛑 ${message}\n`;
    });
  }

  // ...rest of the existing report sections...
  return result;
}
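For reference, with both values present, the metadata lines at the top of the markdown report would look roughly like this (values made up):

**Current**: `feat/metadata` (`def5678`)
**Baseline**: `main` (`abc1234`)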

If you find the naming or output format inadequate, please feel free to suggest a better solution.

mdjastrzebski avatar Oct 01 '22 15:10 mdjastrzebski

Please find the PR

ShaswatPrabhat avatar Oct 02 '22 06:10 ShaswatPrabhat

@ShaswatPrabhat thanks for contributing this feature. Would you fancy going a step further and doing #207?

mdjastrzebski avatar Oct 03 '22 22:10 mdjastrzebski