Benchmark Executor performance
The DSQL executor parses the DSQLQuery object and returns a result set of keys and/or values.
We currently have no benchmark of the performance characteristics of this operation.
Requirements
- Create a benchmark with a large number of keys in the store and validate query performance. It may be a good idea to compare performance at ~1k, ~10k, and ~100k keys in the store.
- Benchmark the time taken and the memory allocations using benchmem (see the sketch after this list).
- Test cases should cover different kinds of queries (with/without ordering specified, with/without a WHERE clause, with/without complex wildcard matchers).
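A minimal sketch of what such a benchmark could look like, assuming it sits alongside the executor package and using a placeholder `executeQuery` function with a plain map as the store (the real executor entry point, store type, and DSQL syntax in the codebase may differ). It exercises the query shapes listed above at each key count, and `b.ReportAllocs()` reports the same allocation data as `-benchmem`:

```go
package core_test

import (
	"fmt"
	"strconv"
	"testing"
)

// executeQuery is a stand-in for the real DSQL executor entry point;
// the actual function name, signature, and store type differ in the codebase.
func executeQuery(query string, store map[string]string) []string {
	keys := make([]string, 0, len(store))
	for k := range store {
		keys = append(keys, k)
	}
	return keys
}

// populateStore fills a map with n synthetic keys to simulate the key store.
func populateStore(n int) map[string]string {
	store := make(map[string]string, n)
	for i := 0; i < n; i++ {
		store["key:"+strconv.Itoa(i)] = "value:" + strconv.Itoa(i)
	}
	return store
}

func BenchmarkExecutor(b *testing.B) {
	// Illustrative DSQL strings covering the scenarios in the requirements;
	// the exact syntax should be taken from the existing executor tests.
	queries := map[string]string{
		"NoFilter":    "SELECT $key, $value FROM `key:*`",
		"WithWhere":   "SELECT $key, $value FROM `key:*` WHERE $value = 'value:1'",
		"WithOrderBy": "SELECT $key, $value FROM `key:*` ORDER BY $key ASC",
	}

	for _, keyCount := range []int{1_000, 10_000, 100_000} {
		store := populateStore(keyCount)
		for name, query := range queries {
			b.Run(fmt.Sprintf("%s/keys=%d", name, keyCount), func(b *testing.B) {
				b.ReportAllocs() // same data as running with -benchmem
				for i := 0; i < b.N; i++ {
					executeQuery(query, store)
				}
			})
		}
	}
}
```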
Hi @JyotinderSingh, I can pick this up. Since I'm going to dive deeper into the codebase anyway, I can work on the test cases and benchmarking.
Assigned
Hi @JyotinderSingh, the plan I'm thinking of is to use a go-dice client, populate keys using the client.JSONSet() function, and then write benchmarks for the different scenarios suggested above. Does this sound okay?
JSON support is not yet available on master
I have covered all the cases available in executor_test.go and run the benchmark cases with 100, 1,000, 10,000, and 100,000 keys. Here's a report.
PR: https://github.com/DiceDB/dice/pull/211
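For anyone reproducing the numbers, Go's standard tooling reports both timings and allocations; a command along these lines should work (the benchmark name and package path are assumptions):

```sh
go test -run '^$' -bench 'BenchmarkExecutor' -benchmem ./core/...
```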
@arpitbbhayani @JyotinderSingh ^^