spark-perf
Test did not produce expected results. Output was: spark-perf
Hi, I am using it with the spark-2.0.0-bin-hadoop2.7 release, but it is not working and shows the following output.
Setting env var SPARK_SUBMIT_OPTS: -Dspark.storage.memoryFraction=0.66 -Dspark.serializer=org.apache.spark.serializer.JavaSerializer -Dspark.locality.wait=60000000 -Dsparkperf.commitSHA=unknown
Running command: /home/shuja/Desktop/data/spark-2.0.0-bin-hadoop2.7/bin/spark-submit --class spark.perf.TestRunner --master spark://shuja:7077 --driver-memory 1g /home/shuja/Desktop/data/spark-perf-master/spark-tests/target/spark-perf-tests-assembly.jar scheduling-throughput --num-trials=10 --inter-trial-wait=3 --num-tasks=10000 --num-jobs=1 --closure-size=0 --random-seed=5 1>> results/spark_perf_output__2016-08-03_11-22-03_logs/scheduling-throughput.out 2>> results/spark_perf_output__2016-08-03_11-22-03_logs/scheduling-throughput.err
Test did not produce expected results. Output was:
Java options: -Dspark.storage.memoryFraction=0.66 -Dspark.serializer=org.apache.spark.serializer.JavaSerializer -Dspark.locality.wait=60000000 Options: scheduling-throughput --num-trials=10 --inter-trial-wait=3 --num-tasks=10000 --num-jobs=1 --closure-size=0 --random-seed=5
If you look in the results/spark_perf_output__2016-08-03_11-22-03_logs/scheduling-throughput.err
text file, you will see the error output from Spark. This should help you isolate your specific error.
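To dump the tail of every `.err` log from your runs in one pass, something like this works (a sketch; it assumes you run it from the spark-perf checkout that produced the `results/` directory, and that the log directories follow the `spark_perf_output__<timestamp>_logs` pattern shown above):

```shell
# Show the last lines of each spark-perf error log under results/.
for f in results/spark_perf_output__*_logs/*.err; do
  # If the glob matched nothing, the literal pattern remains; bail out.
  [ -e "$f" ] || { echo "no .err logs found under results/"; break; }
  echo "== $f =="
  tail -n 40 "$f"
done
```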
I had a similar issue. Looking at the .err log, I found:
Exception in thread "main" java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.render(Lorg/json4s/JsonAST$JValue;)Lorg/json4s/JsonAST$JValue;
at spark.perf.TestRunner$.main(TestRunner.scala:47)
at spark.perf.TestRunner.main(TestRunner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
What is missing? I am running the tests on Ubuntu 16.04 with spark-2.0.0-bin-hadoop2.7.
Has this error been resolved? I face the same error. Can anyone help?
Use the spark-perf fork at https://github.com/a-roberts/spark-perf for Spark 2.0, or update the json4s dependency.
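For context: the NoSuchMethodError is a binary-compatibility clash — spark-perf was compiled against a json4s version whose `JsonMethods.render` has a different bytecode signature than the json4s jar on Spark 2.0's classpath. A minimal sketch of the second option, assuming you are editing spark-perf's sbt build (the exact build file location may differ in your checkout) and that 3.2.11 matches the json4s-jackson jar shipped in your Spark distribution — check the `jars/` directory of your Spark home to confirm the exact version:

```scala
// sbt build fragment (sketch): pin json4s to the version bundled with your
// Spark distribution so JsonMethods.render resolves at runtime. Replace
// "3.2.11" with whatever json4s-jackson version you find under $SPARK_HOME/jars.
libraryDependencies += "org.json4s" %% "json4s-jackson" % "3.2.11"
```

After changing the dependency, rebuild the assembly jar so the bundled json4s classes match the runtime.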