
Benchmarking!

theogf opened this issue 2 years ago · 1 comment

Related to #386, I would like to open the discussion for evaluating the performance of basic functions. Here are some facts/ideas.

Existing tools

  • PkgBenchmark.jl: A nice tool for creating a suite of benchmarks; it provides functions for generating reports on performance variations between commits. It can even produce a Markdown report, which could be posted on the PR
  • NanoSoldier.jl: The tool used by Julia itself to evaluate the performance of the language. I don't believe it can easily be adapted to our setup, but I have not checked the details
  • Github action benchmark: Given a benchmark output, it will also create a report and can post comments directly
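For reference, PkgBenchmark.jl picks up a `SUITE` object defined in `benchmark/benchmarks.jl`. A minimal sketch of what that file could look like for us (the kernel choice and input sizes here are just placeholders, not a settled selection):

```julia
# benchmark/benchmarks.jl -- sketch of a PkgBenchmark-compatible suite.
using BenchmarkTools
using KernelFunctions

const SUITE = BenchmarkGroup()

# Placeholder input; real benchmarks would cover several sizes/types.
x = rand(100)
k = SqExponentialKernel()

SUITE["kernelmatrix"] = BenchmarkGroup()
SUITE["kernelmatrix"]["SqExponentialKernel"] =
    @benchmarkable kernelmatrix($k, $x)
```

One could then run `PkgBenchmark.benchmarkpkg("KernelFunctions")` locally, or `PkgBenchmark.judge` to compare two commits and `export_markdown` to turn the result into a report.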

Potential issues

  • Benchmarks are highly dependent on the machine used; if we use the GitHub-hosted runners, we might see large variance in the results depending on the time of day, etc.
  • We cannot benchmark everything, which means we need to restrict ourselves to, say, the most-used functions/kernels, etc.
  • Adding benchmarks can be a lot of work; can we find a framework where adding new tests is smooth?

Other ideas

  • Not all PRs are performance-critical, so we should be able to invoke whatever tool we use only when needed: perhaps on every commit to master, and on demand for some PRs.
  • What do we want to benchmark? Only `pairwise` and `kernelmatrix`, or also the performance of their gradients?
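To make the last point concrete, benchmarking the gradients could look something like the following sketch using Zygote (kernel, reduction, and input size are illustrative assumptions, not a proposal):

```julia
# Illustrative only: compare the cost of kernelmatrix with the cost of
# differentiating through it. All concrete choices here are placeholders.
using BenchmarkTools
using KernelFunctions
using Zygote

x = rand(50)
k = SqExponentialKernel()

# Forward evaluation.
@benchmark kernelmatrix($k, $x)

# Gradient w.r.t. the inputs, via a scalar reduction of the kernel matrix.
@benchmark Zygote.gradient(x -> sum(kernelmatrix($k, x)), $x)
```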

theogf avatar Oct 19 '21 15:10 theogf

I additionally found https://github.com/tkf/BenchmarkCI.jl which seems to do exactly what we want!
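Going by its README, wiring BenchmarkCI.jl up would roughly amount to a workflow like this (a sketch; action versions, triggers, and the pinned BenchmarkCI version would all need to be decided):

```yaml
# .github/workflows/benchmark.yml -- sketch following BenchmarkCI.jl's README.
name: Run benchmarks
on:
  pull_request:
jobs:
  Benchmark:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: julia-actions/setup-julia@v1
      - name: Install dependencies
        run: julia -e 'using Pkg; pkg"add PkgBenchmark BenchmarkCI"'
      - name: Run benchmarks
        run: julia -e 'using BenchmarkCI; BenchmarkCI.judge()'
      - name: Post results
        run: julia -e 'using BenchmarkCI; BenchmarkCI.postjudge()'
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```

It reuses the same `benchmark/benchmarks.jl` suite that PkgBenchmark.jl expects, and posts the judgement as a PR comment.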

theogf avatar Oct 19 '21 15:10 theogf