
Set up CI pipeline for benchmarking performance

Open • jzelinskie opened this issue 4 years ago • 8 comments

This requires a few steps:

  • Converting our existing ad-hoc process for performance testing into a GitHub Action
  • Connecting GitHub Actions to dedicated hardware that can measure performance without noisy-neighbor issues
  • Formalizing the output of performance tests (see the sketch after this list)
  • Determining how and when these tests run
  • Determining how to produce action items from tests that are run
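
A minimal sketch of what formalized output could look like, assuming the performance tests are standard Go benchmarks (the benchmark and metric names here are illustrative, not taken from the codebase): the testing package already emits machine-readable results, and `b.ReportMetric` can attach custom units that diffing tools such as benchstat understand.

```go
// Illustrative benchmark: the ns/op line that `go test -bench` prints is
// already a formalized, machine-readable result, and ReportMetric adds
// custom per-operation metrics alongside it.
package spicedb_test

import "testing"

func BenchmarkCheckPermission(b *testing.B) {
	// checkPermission stands in for the real code under test.
	checkPermission := func() int { return 1 }

	var depth int
	for i := 0; i < b.N; i++ {
		depth = checkPermission()
	}
	// Shows up as e.g. "1.00 depth/op" next to ns/op in the output,
	// which benchstat can then compare across runs.
	b.ReportMetric(float64(depth), "depth/op")
}
```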

jzelinskie avatar Aug 16 '21 22:08 jzelinskie

We should set up a dedicated worker for benchmarks or explore using a service like bencher to get reliable benchmarks built into our CI workflow.

jzelinskie avatar Oct 28 '21 16:10 jzelinskie

Hi, I work on bencher! If you wanted to try it out (it's free!), it should just take a few clicks to install from the GitHub Marketplace. We've got some more-detailed installation instructions (with pictures!) as well.

Let me know if I can help with anything :)

kirbyquerby avatar Jul 12 '22 06:07 kirbyquerby

@kirbyquerby I tried enabling it, but we use docker for integration testing in our benchmarks -- this is because we benchmark against various databases.

I see in the bencher configuration that you can depend on docker-based services, but can we actually just get access to a docker socket?
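
For concreteness, the pattern in question looks roughly like the sketch below. This assumes the ory/dockertest library and a postgres target; the actual harness differs, and the names are illustrative.

```go
// Simplified sketch of docker-backed integration setup: start a throwaway
// postgres container, wait until it accepts connections, clean it up after.
package datastore_test

import (
	"database/sql"
	"fmt"

	_ "github.com/lib/pq" // registers the "postgres" driver
	"github.com/ory/dockertest/v3"
)

func withThrowawayPostgres(run func(connStr string) error) error {
	// NewPool("") connects to the daemon via the local socket or DOCKER_HOST.
	pool, err := dockertest.NewPool("")
	if err != nil {
		return fmt.Errorf("could not reach docker daemon: %w", err)
	}

	resource, err := pool.Run("postgres", "13", []string{"POSTGRES_PASSWORD=secret"})
	if err != nil {
		return fmt.Errorf("could not start postgres: %w", err)
	}
	defer pool.Purge(resource)

	connStr := fmt.Sprintf(
		"postgres://postgres:secret@localhost:%s/postgres?sslmode=disable",
		resource.GetPort("5432/tcp"))

	// Retry until the database inside the container is ready.
	if err := pool.Retry(func() error {
		db, err := sql.Open("postgres", connStr)
		if err != nil {
			return err
		}
		defer db.Close()
		return db.Ping()
	}); err != nil {
		return fmt.Errorf("postgres never became ready: %w", err)
	}

	return run(connStr)
}
```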

jzelinskie avatar Sep 15 '22 17:09 jzelinskie

@kirbyquerby @odeke-em Are there any rate limits on bencher? I've been playing with things over at https://github.com/jzelinskie/benchpress and half of my pushes don't run. I was pulling my hair out thinking I had an invalid config, but it's not starting builds even without a config.

I'm experimenting with using docker-in-docker so that our test suite can run unmodified.

jzelinskie avatar Sep 19 '22 21:09 jzelinskie

Hey @jzelinskie, Bencher runs only when there is a code change in .go files or when there is a configuration update. The reason is that a PR can contain lots of diverse changes, and we shouldn't waste precious machine time and CPU re-benchmarking when the code itself hasn't changed. We shall document this on https://bencher.orijtech.com/

odeke-em avatar Sep 19 '22 21:09 odeke-em

@odeke-em Thanks for the response -- that makes perfect sense.

Do you have any recommendations on how we could get access to the Docker daemon? Our tests use the Docker API to spin up/down the containers they need for end-to-end tests. I've been playing with docker-in-docker, but that hasn't seemed to work.
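
As an aside, one way to make the suite degrade gracefully on runners without daemon access is a preflight check. A hedged sketch using the official Docker Go client (the helper name is made up):

```go
// requireDockerDaemon skips the calling test or benchmark when no docker
// daemon is reachable, instead of failing mid-setup.
package datastore_test

import (
	"context"
	"testing"
	"time"

	"github.com/docker/docker/client"
)

func requireDockerDaemon(tb testing.TB) {
	tb.Helper()

	// FromEnv honors DOCKER_HOST, so this covers both a local socket and a
	// docker-in-docker TCP endpoint.
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		tb.Skipf("docker client unavailable: %v", err)
	}
	defer cli.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()
	if _, err := cli.Ping(ctx); err != nil {
		tb.Skipf("docker daemon unreachable: %v", err)
	}
}
```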

jzelinskie avatar Sep 22 '22 02:09 jzelinskie

I'm hesitant to provide direct access to the docker daemon because we run benchmarks on bare metal, which makes it tricky to ensure the state of the machine is reset after a benchmark is run. The current setup provides an intentionally trimmed-down set of docker features to minimize the attack surface -- I'm not well-versed enough to understand all the potential pitfalls of allowing arbitrary access to all of docker.

docker-in-docker seems promising, though -- I'll take a look at what it would take to make it work with bencher.

kirbyquerby avatar Oct 12 '22 20:10 kirbyquerby