
Provide HREF link from results view to Github readme page for respective framework

Open sagenschneider opened this issue 5 years ago • 4 comments

Foreword: I'm interested in helping out (and in the process learning some Python). Happy to put together a PR for this issue if there is interest in it. I'm also not sure whether this has been raised before (I could not find any related issue in search), so excuse me if it has been.

Summary

From the results view (e.g. https://www.techempower.com/benchmarks/ ), provide the ability to click on the name of a framework to open the respective GitHub readme page for that framework (e.g. https://github.com/TechEmpower/FrameworkBenchmarks/tree/14eac2d833f1e45a88680427b449e720f970b4e7/frameworks/Java/officefloor ).

Background

I've become familiar with the TechEmpower benchmarks (mainly because I've submitted OfficeFloor). This involved becoming quite aware of how the benchmarks are run (well enough that I run them in my own Travis and locally on my own machine - but I make no claim to be an expert). By the way, thanks: this made performance testing OfficeFloor much easier than trying to maintain our own performance framework. So a big thank you for this.

However, when I show the benchmark results to newcomers, I tend to find they don't read much further than the results. They do try to click on a framework name to get more information, but as it takes them nowhere, they just stay looking at the "summary" benchmark results. Being human, as we are, I then find them making decisions based on the name and the numbers next to it. Having provided differing implementations of OfficeFloor to show trade-offs in tooling/threading choices, I tend to find this gets lost pretty quickly. Furthermore, highly optimised frameworks compared against higher-level-abstraction frameworks can also get lost - which I'm guessing creates the occasional "complaint" regarding how some frameworks are implemented for the tests.

Suggested Improvement

Therefore, rather than showing "just a name", I would like to help make it easier for viewers of the results to get more information on each framework (and potentially how it is implemented for the test). To achieve this, I would like to:

  1. Update the tfb script to include a section in the results that captures:
    • "framework-name": "github-directory"
    • the commit version of the run (to avoid version issues when framework directories change)
  2. This information would then be added to the results.json file. My suggestion is a new section: { "github": { "officefloor": "Java/officefloor", "officefloor-tpr": "Java/officefloor" } } (Note: if frameworks share the same GitHub directory, they all point to it)
  3. The frontend JavaScript then uses this information to open the GitHub page for the framework in a new browser tab. The purpose of this link is twofold:
    • It allows framework submitters to provide information explaining the framework (some frameworks I'd never heard of before, and I had to do a bit of googling to find out more). Linking to the framework's readme page lets that information be provided along with links for further reading.
    • It allows the viewer to easily access the framework's code to make a more "informed" decision. Many a time I read a framework's documentation and it all looks wonderful; then I read the framework's actual test implementation and get a very different opinion. By providing the link to the framework test code, I'm hoping it will be easier for viewers to make more informed choices (rather than, in my cynical opinion, being limited by the typical human reluctance to investigate further).
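The mapping in step 1 above could be sketched roughly as follows. This is a hypothetical helper, not the actual toolset API; the (name, language, directory) tuple layout is an assumption for illustration - in practice the toolset would read these values from each benchmark_config.json.

```python
# Hypothetical sketch of building the proposed "github" section of
# results.json. The input shape (name, language, directory) is an
# assumption; the real toolset holds this data in benchmark_config.json.

def build_github_section(tests):
    """Map each test name to its GitHub directory, "Language/project".

    Tests living in the same framework directory all point to the
    same path.
    """
    return {name: f"{language}/{directory}"
            for name, language, directory in tests}

tests = [
    ("officefloor", "Java", "officefloor"),
    ("officefloor-tpr", "Java", "officefloor"),
]
results_section = {"github": build_github_section(tests)}
# results_section == {"github": {"officefloor": "Java/officefloor",
#                                "officefloor-tpr": "Java/officefloor"}}
```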

Anyway, if there is interest in this feature, I'm happy to do some Python coding to enhance the tfb toolset to output the GitHub directories per framework and the commit version of the run. Though, please excuse my PR ahead of time, as I've not done much Python.

Also, I'm happy to make the front-end changes to the JavaScript. However, I can't seem to find this code anywhere in the repository. If it is not public, I'm hoping that with the added section in results.json this is not too difficult to implement.

If this all makes sense and there is interest, please let me know - and I'll get started. Or if you feel you can do it quicker (and definitely better) than me, I'm happy for you to take it on. Just let me know.

sagenschneider avatar Jan 03 '19 03:01 sagenschneider

Thanks for the suggestion @sagenschneider! This is actually something we've been meaning to do for some time, but it keeps getting lost in the shuffle. It should be pretty easy to add with the information we already have: we take in a project name in the toolset, which is the name of the subdirectory, and combined with the language from benchmark_config.json we have enough information to create these links. Now that you've opened the issue, I'll see if I can move this along. Thanks :)

NateBrady23 avatar Jan 03 '19 17:01 NateBrady23

Thanks, and to be pedantic, I'm assuming you will also use the git information for the URLs, e.g. from a recent continuous benchmark run:

"git": { "commitId": "802ccc0d2c7d59a495fa48594d96b8948561e1ef", "repositoryUrl": "https://github.com/TechEmpower/FrameworkBenchmarks.git", ... },

Including this will also allow seeing what code was actually run for the benchmark. This becomes especially useful when comparing, say, Round 18 and Round 19 results, as one can review the code for each run to see why a framework's performance might have changed - e.g. to apply that knowledge to your own use of the framework. It will also handle framework directory renames and the like when viewing code from historic runs.
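A commit-pinned URL like the one in the summary could be assembled from the existing git block plus the proposed per-framework directory mapping. A rough sketch, where the function name and input shapes are illustrative assumptions rather than anything in the toolset:

```python
def framework_source_url(git_info, github_dir):
    """Build a commit-pinned link to a framework's test directory.

    Combines the repositoryUrl/commitId already emitted in results.json
    with the proposed per-framework "Language/project" mapping, so the
    link always points at the code that was actually benchmarked.
    """
    repo = git_info["repositoryUrl"]
    if repo.endswith(".git"):  # GitHub clone URLs carry a .git suffix
        repo = repo[:-len(".git")]
    return f"{repo}/tree/{git_info['commitId']}/frameworks/{github_dir}"

git_info = {
    "commitId": "802ccc0d2c7d59a495fa48594d96b8948561e1ef",
    "repositoryUrl": "https://github.com/TechEmpower/FrameworkBenchmarks.git",
}
print(framework_source_url(git_info, "Java/officefloor"))
# https://github.com/TechEmpower/FrameworkBenchmarks/tree/802ccc0d2c7d59a495fa48594d96b8948561e1ef/frameworks/Java/officefloor
```

Because the commit id is baked into the URL, the link survives framework directory renames between rounds: historic results keep pointing at the tree they were built from.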

A side benefit is that it will also provide an easy means of showing the code for each continuous-integration run, so contributors know when their changes were incorporated. I'm assuming that once added, it will work for the continuous integration results too :)

Thanks for taking this on. Do speak up if you want help. Otherwise, I'll get back to making OfficeFloor faster :+1:

sagenschneider avatar Jan 04 '19 04:01 sagenschneider

+1 there. When I tried to find which exact version of a framework was used, I wasted 15 minutes without success and then dropped the idea. Framework version info is completely non-transparent, yet at the same time quite significant. Maybe I just missed the right link; if so, I'd advise making it more conspicuous.

Arkemlar avatar Feb 06 '19 20:02 Arkemlar

@Arkemlar While we work on improving this, is there a framework version I can help you find for a particular round/test?

NateBrady23 avatar Feb 06 '19 20:02 NateBrady23