spark-sql-perf
point at spark master?
This is probably a really simple question. How can I point bin/run to a running Spark cluster?
I also have the same question. It seems that this can only run on localhost.
Can someone provide a tutorial for running this on a Spark cluster? And how do we import the TPC-DS data?
Yeah, definitely.
Go to spark-sql-perf-master/src/main/scala/com/databricks/spark/sql/perf and, in RunBenchmark.scala, change `.setMaster("local[*]")` to point to your master. Hopefully this should work.
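For reference, here is a minimal sketch of what the modified setup could look like; the app name, the system-property lookup, and the fallback are illustrative, not the repo's actual code:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: make the master configurable instead of hardcoding it.
// Pass -Dspark.master=spark://your-master-host:7077 at launch
// (or rely on spark-submit --master), falling back to local[*]
// for local development.
val master = sys.props.getOrElse("spark.master", "local[*]")
val conf = new SparkConf()
  .setAppName("spark-sql-perf")
  .setMaster(master)
val sc = new SparkContext(conf)
```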
Local mode:

```scala
val conf = new SparkConf().setAppName("appname").setMaster("local[*]")
val sc = new SparkContext(conf)
```

Cluster mode:

1. Start the Spark master and, if needed, the workers (e.g. `start-all.sh` in `SPARK_HOME/sbin`).
2. By default you can find the Spark master URL by opening http://localhost:8080 in a browser.
3. Point the configuration at that URL:

```scala
val conf = new SparkConf().setAppName("appname").setMaster("spark://localhost:7077")
val sc = new SparkContext(conf)
```
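To confirm the connection actually works, here is a quick sanity check (a sketch; replace the master URL with the one shown in your web UI):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Trivial job that should distribute across the cluster's workers.
val conf = new SparkConf()
  .setAppName("cluster-check")
  .setMaster("spark://localhost:7077") // use your actual master URL
val sc = new SparkContext(conf)
println(sc.parallelize(1 to 100).sum()) // expect 5050.0
sc.stop()
```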
I think this will help you run your example.