
Results: 54 spark-sql-perf issues

Hello, does this benchmark provide fine-grained statistics and measure the execution time for every node in a query plan in the Spark SQL library? Thank you for your help. Best regards.
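
For anyone needing per-operator numbers outside the harness, here is a minimal sketch in plain Spark (not spark-sql-perf API) that dumps the SQLMetrics each physical operator accumulates once an action has run:

```
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("per-node-metrics").getOrCreate()

// Any query; its metrics are only populated after an action executes it.
val df = spark.range(0, 1000000).selectExpr("id % 10 AS k").groupBy("k").count()
df.collect()

// Each physical operator carries a Map[String, SQLMetric]
// (output rows, scan/aggregate time, etc.).
df.queryExecution.executedPlan.foreach { node =>
  val ms = node.metrics.map { case (name, m) => s"$name=${m.value}" }.mkString(", ")
  println(s"${node.nodeName}: $ms")
}
```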

Hi, I'm using Spark 1.6.0 and I want to run the benchmark. However, I first need to set up the benchmark (I guess). In the tutorial it's written that we have...
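
For what it's worth, the basic setup mirrors the calls quoted elsewhere on this page: build the jar, start a shell with it, then create the tables object and generate data. A rough sketch, assuming the Spark 1.6-era `Tables` constructor of `(sqlContext, dsdgenDir, scaleFactor)`; names may differ in your version, and the paths are placeholders:

```
import com.databricks.spark.sql.perf.tpcds.Tables

// dsdgen must be built locally from the TPC-DS toolkit; scale factor is in GB.
val tables = new Tables(sqlContext, "/path/to/tpcds-kit/tools", 1)

// The seven positional arguments correspond (in the older API, if memory
// serves) to: location, format, overwrite, partitionTables,
// useDoubleForDecimal, clusterByPartitionColumns, filterOutNullPartitionValues.
tables.genData("file:///tmp/tpcds-sf1", "parquet", true, false, false, false, false)
tables.createExternalTables("file:///tmp/tpcds-sf1", "parquet", "tpcds_sf1", false)
```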

Hello everyone, I ran the TPC-DS benchmark for HAWQ & Spark SQL on Hadoop. Could you please help me verify the numbers? Ref: https://github.com/pivotalguru/TPC-DS and https://github.com/databricks/spark-sql-perf. A subset of 19 queries was executed...

Hello, I got the following error when trying to build the spark-sql-perf package for Spark 2.1.0. Has anyone seen this before, and does anyone have an idea how to fix it? Thanks. Steps: git...
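
One common culprit is a Spark version pinned in the build. If the project still uses the sbt-spark-package plugin (an assumption; check your checkout's build.sbt), the override would look roughly like this fragment:

```
// build.sbt fragment (sbt-spark-package plugin keys; verify against the repo)
sparkVersion := "2.1.0"
sparkComponents ++= Seq("sql", "hive", "mllib")
```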

Hi, on Spark 2.3.2:

```
scala> tables.genData(
     |   location = rootDir,
     |   format = format,
     |   overwrite = true, // overwrite the data that is already there
     |   partitionTables =...
```
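
For comparison, the full call with every argument named, as it might look against the Spark 2.x `TPCDSTables` API (the trailing parameters are taken from the project's README of that era and may differ in other versions):

```
tables.genData(
  location = rootDir,
  format = format,
  overwrite = true,                     // overwrite the data that is already there
  partitionTables = true,               // write the fact tables partitioned
  clusterByPartitionColumns = true,     // shuffle toward one file per partition
  filterOutNullPartitionValues = false, // keep rows with null partition keys
  tableFilter = "",                     // "" means generate all tables
  numPartitions = 100)                  // parallelism of the generation job
```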

Hi experts @davies. I am using spark-sql-perf to generate TPC-DS 1 TB data with partitionTables enabled, e.g. `tables.genData("hdfs://ip:8020/tpctest", "parquet", true, true, false, false, false)`. But I found that some of...

hi: I am facing the issues below when trying to run this code. For the command `tables.createExternalTables("file:///home/tpctest/", "parquet", "mydata", false)` I get:

```
java.lang.RuntimeException: [1.1] failure: ``with'' expected but identifier CREATE found...
```
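
That parse error usually means the DDL went through the plain SQLContext parser: createExternalTables emits CREATE EXTERNAL TABLE statements, which need Hive support. A sketch of the usual fix:

```
// Spark 1.x: construct a HiveContext so DDL statements parse.
import org.apache.spark.sql.hive.HiveContext
val sqlContext = new HiveContext(sc)

// Spark 2.x equivalent: enable Hive support on the session.
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
```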

Hi, I am facing the issues below when trying to run this code. For the command 1) `val experiment = tpcds.runExperiment(tpcds.interactiveQueries)` I get: WARN TaskSetManager: Stage 268 contains a task of...
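
As an aside, the usual pattern for driving a run to completion (per the project's README; the large-task-size message is only a warning and does not by itself fail the run) is:

```
val experiment = tpcds.runExperiment(tpcds.interactiveQueries)
// runExperiment is asynchronous; block until it finishes (timeout in seconds).
experiment.waitForFinish(60 * 60 * 10)
```

If I recall the harness correctly, results land as JSON under the benchmark's configured results location and can be inspected afterwards with spark.read.json.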

## What's done in this PR?

- OneHotEncoderEstimator benchmark added.
- Dependency changed to spark-2.3.0.
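
For context, the operator under benchmark is Spark 2.3's OneHotEncoderEstimator, the fit/transform replacement for the old transformer-only OneHotEncoder. A minimal standalone sketch of the API (not the PR's benchmark code):

```
import org.apache.spark.ml.feature.OneHotEncoderEstimator
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("ohe-demo").getOrCreate()
import spark.implicits._

// Category indices as a numeric column, as the encoder expects.
val df = Seq(0.0, 1.0, 2.0, 1.0).toDF("category")

// Unlike the old OneHotEncoder, the estimator is fit to learn the
// category sizes, then applied as a model.
val encoder = new OneHotEncoderEstimator()
  .setInputCols(Array("category"))
  .setOutputCols(Array("categoryVec"))

val model = encoder.fit(df)
model.transform(df).show()
```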

Hi all, I am new to Spark and Scala. I have the source code for the Spark SQL Performance Tests and dsdgen. Can anyone tell me how to proceed next...