
Track package performance over time

Open jimhester opened this issue 8 years ago • 7 comments

covr and codecov.io are great for tracking code coverage during a package's development. Another aspect of a package that would be useful to track is the performance of one or more benchmark functions.

This is useful for package authors to ensure they don't inadvertently introduce a performance regression when adding new features. It is also useful for users to see how much a new version improves or degrades performance. We could also run the benchmarks when a PR is submitted, to see how the changes would impact current performance.

I wrote a rough example at https://github.com/jimhester/benchthat and @krlmlr has dplyr specific code to do this at https://krlmlr.github.io/dplyr.benchmark/.

Some useful features to me would be

  1. Store the results in an easy-to-parse file in the repository (my draft puts them in /docs/benchmarks).
  2. Helper functions that are easy to run automatically in a package's tests.
  3. Run a benchmark retroactively over the repo history.
  • Is there a peak-finding algorithm / git bisect we could use to find performance breakpoints, so you don't have to exhaustively benchmark each commit?
  4. Visualizing and reporting on benchmark results.
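A minimal sketch of item 1, using only base R: append one row per run to an easy-to-parse CSV under docs/benchmarks. `record_benchmark()` and the column layout are hypothetical illustrations, not what benchthat actually does.

```r
# Hypothetical helper: time `times` runs of fn() and append the median
# elapsed time, plus the current commit, to docs/benchmarks/<name>.csv.
record_benchmark <- function(name, fn, times = 25L, dir = "docs/benchmarks") {
  dir.create(dir, recursive = TRUE, showWarnings = FALSE)
  secs <- vapply(seq_len(times),
                 function(i) system.time(fn())[["elapsed"]],
                 numeric(1))
  # record the commit so results can be lined up with repo history;
  # NA if we are not inside a git repository
  commit <- tryCatch(
    system("git rev-parse --short HEAD", intern = TRUE, ignore.stderr = TRUE),
    error = function(e) NA_character_, warning = function(w) NA_character_)
  row <- data.frame(name = name, commit = commit[1],
                    date = format(Sys.time(), "%Y-%m-%dT%H:%M:%S"),
                    median_secs = median(secs))
  path <- file.path(dir, paste0(name, ".csv"))
  # write the header only on first use, then append rows
  write.table(row, path, sep = ",", row.names = FALSE,
              col.names = !file.exists(path), append = file.exists(path))
}

record_benchmark("sum_1e6", function() sum(runif(1e6)))
```

Taking a function rather than an expression sidesteps R's lazy-evaluation pitfalls (a promise is only evaluated once, so repeated timings of a bare expression would measure a cached value).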

jimhester avatar Apr 24 '17 13:04 jimhester
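The git-bisect question in item 3 above can be sketched with `git bisect run`, which needs a check script that exits 0 for a "fast" commit and non-zero for a "slow" one; git then bisects in O(log n) steps instead of benchmarking every commit. The demo below fakes the benchmark with a marker file (SLOW) so it is self-contained; a real check script would time the package instead.

```shell
#!/bin/sh
# Build a throwaway repo where the third commit "introduces a slowdown"
# (marked by a SLOW file), then let git bisect find that commit.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
gc() { git -c user.email=demo@example.com -c user.name=demo commit -q "$@"; }

gc --allow-empty -m "fast 1"
gc --allow-empty -m "fast 2"
touch SLOW && git add SLOW && gc -m "introduce slowdown"   # first bad commit
gc --allow-empty -m "still slow"

first=$(git rev-list --max-parents=0 HEAD)   # root commit, known good
git bisect start
git bisect bad HEAD
git bisect good "$first"
# exit 0 = fast (good), non-zero = slow (bad); here "fast" = no SLOW file
verdict=$(git bisect run sh -c 'test ! -f SLOW' | grep "is the first bad commit")
echo "$verdict"
git bisect reset
cd / && rm -rf "$repo"
```

The same pattern works for a real regression if the check script benchmarks the package and compares against a threshold, though noisy timings make a clean good/bad boundary harder than in this toy example.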

There's also https://github.com/analyticalmonk/Rperform

noamross avatar Apr 24 '17 14:04 noamross

Rperform seems like it already does most of this, but it clearly needs more exposure / use, and possibly some thought about better integration with pkgdown / Travis so it is more useful for PR results.

jimhester avatar Apr 24 '17 14:04 jimhester

I love this idea!

Question re: outside support:

codecov.io is to code coverage as ??? is to benchmarking

Or does this aspect have to be handled by the package described here? The display of results over time could potentially be handled in pkgdown site.

jennybc avatar Apr 25 '17 17:04 jennybc

I don't know of anything like codecov.io for tracking benchmarks over time. If there were something, we could use it, or maybe set up a simple service to do so.

jimhester avatar Apr 25 '17 17:04 jimhester

codecov.io is to code coverage as ??? is to benchmarking

https://github.com/tobami/codespeed is the only one I know of. You need to run your own service.

Julia used to run it, I don't know if they still do.

gaborcsardi avatar Apr 25 '17 17:04 gaborcsardi

The Julia site used to be at http://speed.julialang.org/

It is gone.

gaborcsardi avatar Apr 25 '17 17:04 gaborcsardi

Seems to me that it would be logical to integrate performance testing with testthat.
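A sketch of what that integration might look like. `expect_faster_than()` is a hypothetical helper, not part of testthat; the baseline would come from a stored results file like the one in item 1 of the original post, and the slack factor absorbs machine-to-machine noise.

```r
library(testthat)

# Hypothetical expectation: fail when the median timing of fn() exceeds
# the stored baseline by more than `slack`.
expect_faster_than <- function(fn, baseline_secs, times = 10L, slack = 2) {
  secs <- vapply(seq_len(times),
                 function(i) system.time(fn())[["elapsed"]],
                 numeric(1))
  expect_lt(median(secs), baseline_secs * slack)
}

test_that("summing a vector has not regressed", {
  expect_faster_than(function() sum(runif(1e4)), baseline_secs = 0.05)
})
```

Run inside the package's test suite, a regression would then fail CI just like any other broken test.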

jsta avatar Jun 09 '17 13:06 jsta