spark-sql-perf
suitable executor-memory for spark-sql-perf testing
Is it okay to configure `executor-memory=1G`? I am running the spark-sql-perf tpcds.tpcds2_4Queries benchmark (q1-q99) against 100 GB of TPC-DS data on a Kubernetes cluster, and I want to find the most suitable executor memory for these queries.
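For reference, here is a minimal sketch of how I am driving the run, following the pattern from the spark-sql-perf README. The database name (`tpcds_100g`) and result path are hypothetical placeholders; the TPC-DS tables are assumed to have been generated already, and `spark.executor.memory` is fixed at submit time (e.g. `spark-submit --conf spark.executor.memory=1g ...`), not inside the job:

```scala
import org.apache.spark.sql.SparkSession
import com.databricks.spark.sql.perf.tpcds.TPCDS

object TpcdsMemoryProbe {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("tpcds2_4-memory-probe")
      .getOrCreate()

    // Hypothetical database and result location -- substitute your own.
    spark.sql("USE tpcds_100g")
    val resultLocation = "/tmp/spark-sql-perf/results"

    val tpcds = new TPCDS(sqlContext = spark.sqlContext)
    val experiment = tpcds.runExperiment(
      tpcds.tpcds2_4Queries,          // q1-q99 of the TPC-DS 2.4 spec
      iterations = 1,
      resultLocation = resultLocation,
      forkThread = true)
    experiment.waitForFinish(24 * 60 * 60) // timeout in seconds

    spark.stop()
  }
}
```

My plan is to repeat this run while sweeping `spark.executor.memory` (e.g. 1g, 4g, 8g) via spark-submit and compare the timings written to the result location, since I suspect 1 GB executors may spill or OOM on the shuffle-heavy queries at this scale.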