Update benchmarks
Due to the larger restructuring, many of the benchmarks have become obsolete. To bring benchmarking back up to a good standard, we should reimplement some of the previous benchmarks and also add some new ones.
- [ ] Microbenchmarks for stress-testing the system (see #136, currently in development); a harness sketch follows this list.
- [ ] Any benchmarks / workloads mentioned in the Portals Onward'22 paper, to be included in the microbenchmarks.
- [ ] The NEXMark benchmark (at minimum Queries 1-4, as presented in the paper).
- [ ] The SAVINA Actor Benchmark (for the Actor Library).
- [ ] A benchmark for the PortalsJS runtime. This could reuse the microbenchmarks.
- Optional: A custom benchmark for stateful serverless workloads.
- Optional: The YCSB Benchmark.
- Optional: A benchmark that runs across JS and JVM nodes (for hybrid cloud/edge executions).
- Optional: Retwis
- Optional: DeathStarBench
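
As a possible starting point for the microbenchmarks (first item above), here is a minimal sketch of a JMH-based harness, assuming sbt-jmh is added to the build. The class name `PingPongBenchmark`, the `numEvents` parameter, and the measured body (a stand-in counter loop) are hypothetical placeholders; a real benchmark would build a Portals workflow and drive `numEvents` events through the test runtime instead.

```scala
package portals.benchmark

import java.util.concurrent.TimeUnit

import org.openjdk.jmh.annotations.*
import org.openjdk.jmh.infra.Blackhole

/** Sketch of a throughput microbenchmark (hypothetical names and workload). */
@State(Scope.Benchmark)
@BenchmarkMode(Array(Mode.Throughput))
@OutputTimeUnit(TimeUnit.SECONDS)
@Warmup(iterations = 3, time = 1)
@Measurement(iterations = 5, time = 1)
@Fork(1)
class PingPongBenchmark:

  /** Number of events per invocation, swept over two sizes. */
  @Param(Array("1024", "1048576"))
  var numEvents: Int = 0

  @Benchmark
  def pingPong(bh: Blackhole): Unit =
    // Stand-in workload: bounce a counter `numEvents` times; a real benchmark
    // would submit events to a workflow and await their completion here.
    var remaining = numEvents
    while remaining > 0 do remaining -= 1
    bh.consume(remaining)
```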
In addition, the benchmarks should be run and checked by the GitHub Actions workflows.
Notes:
- This issue replaces some previous issues on the topic with overlapping scope (#133, #137, #144, #210). Many of the benchmarks were removed to start from a clean slate; they can still be found at this commit: https://github.com/portals-project/portals/commit/e8e487e050d2a5f57511b57e5610f76affe4fb0f.
- PortalsJS benchmark: as a bonus, the benchmark should run in a browser page and print the results there.