Continuous Benchmarking
Started initial setup in #572
- [x] Generate a minimal set of benchmarks (see the first sketch after this list)
  - [x] Some basic ones are present in https://github.com/FluxML/Flux.jl/tree/master/perf
- [x] Benchmark different AD backends ~-- use `DifferentiationInterface.jl` for this part~. The main backends are Zygote.jl, Tracker.jl, Enzyme.jl, Tapir.jl, and ReverseDiff.jl. It's working without DI, so I won't bother rewriting them; if anyone rewrites them, I would be happy to merge it. (One backend is sketched after this list.)
- [x] Comparative benchmarking against Flux.jl (see the last sketch after this list)
- [ ] Migrate to Buildkite to allow for CUDA benchmarks
- [ ] The current benchmarks are very noisy and run with too few threads and cores to make any reasonable sense of the numbers
- [x] Don't keep everything in memory; use `setup` to create the parameters and inputs (as in the first sketch below)
- [ ] The ordering of the benchmarks is a bit screwed up right now; we could potentially fix it by migrating to a custom build step using an easier-to-use (for me) plotting library
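A minimal sketch of what one of these benchmarks could look like with BenchmarkTools.jl, creating the parameters and inputs in `setup` instead of keeping them around as globals. The layer sizes, group names, and the `SUITE` variable are illustrative assumptions, not the actual suite.

```julia
# Illustrative sketch: one BenchmarkTools.jl entry for a Lux layer, with the
# parameters and inputs built in the `setup` phase rather than held in memory.
using BenchmarkTools, Lux, Random

const SUITE = BenchmarkGroup()
SUITE["dense"] = BenchmarkGroup()

dense = Dense(128 => 128, tanh)

SUITE["dense"]["forward"] = @benchmarkable $dense(x, ps, st) setup = (
    rng = Random.default_rng();
    (ps, st) = Lux.setup(rng, $dense);
    x = randn(rng, Float32, 128, 64))

# results = run(SUITE; verbose = true)
```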
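For the AD backend comparisons, a sketch of a single backend entry using Zygote.jl on a small made-up model; the other backends (Tracker.jl, Enzyme.jl, Tapir.jl, ReverseDiff.jl) would need analogous entries calling their own gradient APIs, and the model, sizes, and loss here are assumptions for illustration.

```julia
# Illustrative sketch: benchmark the reverse-mode gradient of a small Lux
# model with Zygote.jl. Other backends would follow the same pattern with
# their own gradient calls.
using BenchmarkTools, Lux, Random, Zygote

model = Chain(Dense(64 => 64, tanh), Dense(64 => 1))
ps, st = Lux.setup(Random.default_rng(), model)
x = randn(Float32, 64, 32)

# Scalar loss over the model output; `first` drops the returned state.
sumabs2_loss(m, x, ps, st) = sum(abs2, first(m(x, ps, st)))

@benchmark Zygote.gradient(p -> sumabs2_loss($model, $x, p, $st), $ps)
```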
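For the comparative benchmarks against Flux.jl, one possible shape (again just a sketch with made-up sizes) is to time equivalent layers from both libraries on the same input:

```julia
# Illustrative sketch: forward pass of equivalent Flux.jl and Lux.jl layers.
# Flux keeps its parameters inside the layer; Lux takes them explicitly, so
# the Lux call also passes the parameters and state.
using BenchmarkTools, Random
import Flux, Lux

x = randn(Float32, 128, 64)

flux_dense = Flux.Dense(128 => 128, tanh)
lux_dense = Lux.Dense(128 => 128, tanh)
ps, st = Lux.setup(Random.default_rng(), lux_dense)

@benchmark $flux_dense($x)
@benchmark first($lux_dense($x, $ps, $st))
```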