
ClickBench: a Benchmark For Analytical Databases

Results: 46 ClickBench issues

https://www.boilingdata.com/, suggested by @danthegoodman1 in a discussion at https://clickhouse.com/slack

help wanted

Is it possible to get a larger dataset, say 2 TB or 5 TB? Testing on a 200 GB dataset that is easily compressible down to 50 GB with modern compression algorithms...
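As a rough illustration of the compressibility point (a sketch using made-up repetitive sample data, not the actual benchmark dataset), one can estimate a compression ratio on a sample before deciding whether a dataset is representative:

```shell
# Build a highly repetitive sample file; web-analytics-style rows
# (the column values below are invented for this illustration).
sample=$(mktemp)
yes "2013-07-15,http://example.com/page,Mozilla/5.0" | head -n 100000 > "$sample"

# Compare raw size to gzip-compressed size (GNU stat/gzip assumed).
raw=$(stat -c%s "$sample")
packed=$(gzip -c "$sample" | wc -c)
echo "raw: $raw bytes, gzip: $packed bytes"

rm -f "$sample"
```

Highly repetitive data like this compresses by orders of magnitude, which is why storage-size comparisons on such a dataset can be misleading.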

What is YTsaurus: https://ytsaurus.tech/ Since YTsaurus [does not have](https://github.com/ytsaurus/ytsaurus/issues/21#issuecomment-1488384350) a built-in benchmark tool, it could be a good idea to add YTsaurus support to ClickBench to make it possible to benchmark...

It would be nice to see a [Quickwit](https://github.com/quickwit-oss/quickwit) comparison as well, since at least Elasticsearch is supported here right now. Pinging @fulmicoton: perhaps you could be interested in it at least...

At the moment, [the below command](https://github.com/ClickHouse/ClickBench/blob/732c3cff405f02fcda09f908db8579ce83232939/mysql/benchmark.sh#L23) is used to calculate the storage size of the table in MySQL:

```bash
sudo du -bcs /var/lib/mysql
```

Because of the irrelevant...
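To see why measuring the whole data directory can overstate the table size, here is a contrived sketch: dummy files stand in for the server's own files (log files, system tables), and the file names are invented, not MySQL's actual layout.

```shell
# Simulate a data directory: one subdirectory holding "the table",
# plus unrelated server files alongside it.
tmp=$(mktemp -d)
mkdir -p "$tmp/data"

# The table's own data: 1 MiB.
dd if=/dev/zero of="$tmp/data/table.ibd" bs=1024 count=1024 2>/dev/null
# Unrelated server files: 2 MiB.
dd if=/dev/zero of="$tmp/ib_logfile0" bs=1024 count=2048 2>/dev/null

# du over the whole directory counts everything; du over the table's
# subdirectory counts only the table (GNU du, -b for apparent size).
whole=$(du -bcs "$tmp" | tail -1 | cut -f1)
table=$(du -bcs "$tmp/data" | tail -1 | cut -f1)
echo "whole datadir: $whole bytes, table only: $table bytes"

rm -rf "$tmp"
```

The whole-directory figure is roughly triple the table-only figure here, which mirrors the inaccuracy the issue describes; querying `information_schema` per table is one way around it, at the cost of trusting the server's own accounting.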

Regarding issue #92, I modified the table size calculation code in the mysql-myisam and mysql folders. I can't run the benchmark for the whole dataset, so I can't modify the final...

Thank you for the contribution! Please make sure that you've updated `index.html`. To do this, run `generate-results.sh` from the root directory of the ClickBench repository.

I have run the newest version of Oxla (1.20) and created an index that exactly matches the ordering of the table definition in ClickHouse.

Thank you for the contribution! Please make sure that you've updated `index.html`. To do this, run `generate-results.sh` from the root directory of the ClickBench repository.