Introduce benchmarks and tracking of performance over time
We'd like to have some performance tracking over time, similar to https://arewefastyet.com/
Probably don't bother with setting up a webpage that holds the historic data and appends to it - as a first iteration, let's just have a script (possibly in a different repo; a sketch follows the list below) that:
- runs the benchmarks suite on all commits in this repo
- outputs values for each commit (those can be N/A if that particular revision didn't support that Elm input and errored out - that's to be expected)
- and then generates some graphical representation (plaintext generation of SVG? D3? terezka/line-charts / elm-plot / ...? Feel free to choose the easiest solution) or at least a table with the numbers.
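A minimal sketch of that script, assuming a Node environment and a hypothetical `npm run bench` entry point that prints a single timing; the real invocation will depend on how the suite ends up being packaged:

```ts
// bench-all-commits.ts - a minimal sketch; `npm run bench` and the
// results.json output path are assumptions, not an existing interface.
import { execSync } from "child_process";
import * as fs from "fs";

// Remember where we started so we can restore the working tree afterwards.
const originalRef = execSync("git rev-parse --abbrev-ref HEAD", {
  encoding: "utf8",
}).trim();

// Oldest-first list of all commits on the current branch.
const commits = execSync("git rev-list --reverse HEAD", { encoding: "utf8" })
  .trim()
  .split("\n");

const results: Record<string, string> = {};

for (const commit of commits) {
  execSync(`git checkout --quiet ${commit}`);
  try {
    // Hypothetical benchmark entry point; assumed to print a number (ms).
    const out = execSync("npm run --silent bench", { encoding: "utf8" });
    results[commit] = out.trim();
  } catch {
    // Older revisions may not handle the input at all - that's expected.
    results[commit] = "N/A";
  }
}

execSync(`git checkout --quiet ${originalRef}`);
fs.writeFileSync("results.json", JSON.stringify(results, null, 2));
```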
We'll need:
- [ ] a corpus of interesting projects / modules / expressions to run the benchmarks on
- [ ] the ways to run elm-in-elm on each input - possibly this:
  - [ ] parse
  - [ ] parse + desugar
  - [ ] parse + desugar + typecheck
  - [ ] parse + desugar + typecheck + optimize
  - [ ] parse + desugar + typecheck + optimize + emit
  - maybe with some numerical trickery we can separate those cases from each other so that we know "the desugar phase took this long" (e.g. by subtracting the parse-only time from the parse + desugar time; see the sketch after this list)
- [ ] a script to run benchmarks on each of these
- [ ] a script to convert the raw numbers to a graphical representation (an SVG sketch is at the end of this issue)
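To illustrate the "numerical trickery" note above: if each benchmark measures a cumulative prefix of the pipeline, per-phase times fall out as differences between adjacent measurements. A sketch with made-up numbers:

```ts
// phase-times.ts - derives per-phase durations from cumulative timings.
// The input shape is an assumption: one cumulative measurement (ms) per
// pipeline prefix, in the order of the checklist above.
const cumulative: [string, number][] = [
  ["parse", 120],
  ["desugar", 150],
  ["typecheck", 480],
  ["optimize", 510],
  ["emit", 590],
];

// Each phase's own cost = its cumulative time minus the previous prefix's.
const perPhase = cumulative.map(([phase, total], i) => {
  const prev = i === 0 ? 0 : cumulative[i - 1][1];
  return { phase, ms: total - prev };
});

console.table(perPhase);
// parse: 120, desugar: 30, typecheck: 330, optimize: 30, emit: 80
```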
The output files should also have some metadata about the machine they were run on (Node version, OS, CPU, memory, etc.?).
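Collecting that metadata is cheap with Node's built-in `os` module; a sketch of what could be embedded in each output file:

```ts
// machine-info.ts - collects the run environment to embed in result files.
import * as os from "os";

const machineInfo = {
  node: process.version, // e.g. "v18.19.0"
  os: `${os.platform()} ${os.release()}`,
  cpu: os.cpus()[0]?.model ?? "unknown",
  cores: os.cpus().length,
  memoryGiB: Math.round(os.totalmem() / 1024 ** 3),
};

console.log(JSON.stringify(machineInfo, null, 2));
```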
We can hook you up with a new repository (say elm-in-elm/benchmarks) when needed.
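And for the "plaintext generation of SVG" option mentioned above, a sketch that turns the (hypothetical) results.json from the first sketch into a bare-bones line chart, skipping N/A revisions:

```ts
// chart.ts - generates an SVG line chart as plain text from results.json,
// mapping commit index to x and measured time (ms) to y.
import * as fs from "fs";

const results: Record<string, string> = JSON.parse(
  fs.readFileSync("results.json", "utf8")
);

const points = Object.values(results)
  .map((v, i) => ({ x: i, y: parseFloat(v) }))
  .filter((p) => !Number.isNaN(p.y)); // drop "N/A" revisions

const width = 600;
const height = 200;
const maxY = Math.max(...points.map((p) => p.y));
const maxX = Math.max(...points.map((p) => p.x), 1);

// Scale commit index to the x axis and time to the (flipped) y axis.
const polyline = points
  .map((p) => `${(p.x / maxX) * width},${height - (p.y / maxY) * height}`)
  .join(" ");

fs.writeFileSync(
  "chart.svg",
  `<svg xmlns="http://www.w3.org/2000/svg" width="${width}" height="${height}">
  <polyline fill="none" stroke="black" points="${polyline}"/>
</svg>`
);
```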