DC/OS: cannot submit Spark job, Jetty 500 error
After installing Spark on DC/OS, I submitted a job with the following command:
dcos spark run --submit-args='--class com.nlabs.test.ScalaPI https://github.com/lihengzkj/spark-demo/raw/master/scalapractice_2.11-1.0.jar 30'
But I got the following error:
azureuser@dcos-master-01234567-0:~$ dcos spark run --submit-args='--class com.nlabs.test.ScalaPI https://github.com/lihengzkj/spark-demo/raw/master/scalapractice_2.11-1.0.jar 30'
127.0.0.1 - - [04/May/2017 08:42:59] "POST /v1/submissions/create HTTP/1.1" 500 -
Spark submit failed:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/05/04 08:42:59 INFO RestSubmissionClient: Submitting a request to launch an application in mesos://localhost:36880.
Exception in thread "main" org.apache.spark.deploy.rest.SubmitRestProtocolException: Malformed response received from server
at org.apache.spark.deploy.rest.RestSubmissionClient.readResponse(RestSubmissionClient.scala:299)
at org.apache.spark.deploy.rest.RestSubmissionClient.org$apache$spark$deploy$rest$RestSubmissionClient$$postJson(RestSubmissionClient.scala:255)
at org.apache.spark.deploy.rest.RestSubmissionClient$$anonfun$createSubmission$3.apply(RestSubmissionClient.scala:95)
at org.apache.spark.deploy.rest.RestSubmissionClient$$anonfun$createSubmission$3.apply(RestSubmissionClient.scala:91)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.deploy.rest.RestSubmissionClient.createSubmission(RestSubmissionClient.scala:91)
at org.apache.spark.deploy.rest.RestSubmissionClient$.run(RestSubmissionClient.scala:450)
at org.apache.spark.deploy.rest.RestSubmissionClient$.main(RestSubmissionClient.scala:463)
at org.apache.spark.deploy.rest.RestSubmissionClient.main(RestSubmissionClient.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:793)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:214)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:128)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source:
HTTP ERROR: 500
Problem accessing /v1/submissions/create. Reason:
java.lang.NullPointerException
Powered by Jetty:// ; line: 1, column: 2]
	at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1581)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:533)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:462)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._handleOddValue(ReaderBasedJsonParser.java:1624)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:689)
	at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3776)
	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3721)
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2726)
	at org.json4s.jackson.JsonMethods$class.parse(JsonMethods.scala:20)
	at org.json4s.jackson.JsonMethods$.parse(JsonMethods.scala:50)
	at org.apache.spark.deploy.rest.SubmitRestProtocolMessage$.parseAction(SubmitRestProtocolMessage.scala:112)
	at org.apache.spark.deploy.rest.SubmitRestProtocolMessage$.fromJson(SubmitRestProtocolMessage.scala:130)
	at org.apache.spark.deploy.rest.RestSubmissionClient$$anonfun$1.apply(RestSubmissionClient.scala:278)
	at org.apache.spark.deploy.rest.RestSubmissionClient$$anonfun$1.apply(RestSubmissionClient.scala:265)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
It looks like the DC/OS submit command doesn't work; the job never reaches Spark. Can anyone guide me through solving this? I don't even know where to start looking for the cause. Thanks a lot.
Same issue with DC/OS 1.9. Any news?
Can you please paste the dispatcher logs?
You mean the DNS dispatcher?
No, the Spark Dispatcher, which is what gets installed when you run dcos package install spark. It's the server you submit your Spark job to, and it's what is throwing this exception. More info: https://docs.mesosphere.com/service-docs/spark/v1.0.9-2.1.0-1/troubleshooting/
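To pull those logs yourself, something like the following should work. This is a sketch assuming a default install where the dispatcher task is named spark; the exact task name and the availability of dcos task log depend on your DC/OS CLI version.

```shell
# List running tasks to find the dispatcher (named "spark" by default)
dcos task

# Tail the dispatcher's stdout; add --completed to include crashed instances
dcos task log spark

# The stack trace usually lands in stderr
dcos task log spark stderr
```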
In the Spark log, just before the NPE, I see:
17/05/30 21:10:17 ERROR MesosClusterScheduler: Error received: Framework has been removed
I0530 21:10:17.716290 81 sched.cpp:1217] Aborting framework '93bd0f2b-381d-46a3-a52b-dca47e16b81c-0002'
17/05/30 21:10:17 INFO MesosClusterScheduler: driver.run() returned with code DRIVER_ABORTED
17/05/30 21:10:17 INFO Server: Started @3094ms
17/05/30 21:10:17 INFO Utils: Successfully started service on port 13378.
17/05/30 21:10:17 INFO MesosRestServer: Started REST server for submitting applications on port 13378
17/05/30 21:37:41 INFO MesosClusterScheduler: Reviving Offers.
17/05/30 21:37:41 WARN ServletHandler: /v1/submissions/create
java.lang.NullPointerException
Is this the main issue?
Yeah, it looks like you didn't do a complete uninstall the last time you removed Spark: https://docs.mesosphere.com/service-docs/spark/v1.0.9-2.1.0-1/uninstall/
Admittedly, this can be surprising. We're looking to automate this uninstall step soon.
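For reference, the complete uninstall per the linked docs was roughly the following. This is a sketch assuming a default install (role spark-role, principal spark-principal, ZooKeeper node spark_mesos_dispatcher); check the docs for the exact names matching your service name and version.

```shell
# 1. Uninstall the package. This alone does NOT remove the leftover
#    Mesos reservations or ZooKeeper state, which is what causes the NPE
#    on a later reinstall.
dcos package uninstall spark

# 2. SSH to the leading master...
dcos node ssh --master-proxy --leader

# 3. ...and run the janitor container there to clean up the role,
#    principal, and ZooKeeper node left behind by the old framework.
docker run mesosphere/janitor /janitor.py \
  -r spark-role -p spark-principal -z spark_mesos_dispatcher
```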
OK, I think I hit the same issue on the very first fresh install... maybe not, I can't remember. I'll reset the whole stack. Thanks @mgummelt.