
executor not running

Open wangyhwyh753 opened this issue 5 years ago • 3 comments

I have installed the operator. When I run the example spark-py-pi.yaml, the driver pod is launched and runs, but no executor pod is ever created. The driver log shows:

```
++ id -u
+ myuid=0
++ id -g
+ mygid=0
+ set +e
++ getent passwd 0
+ uidentry=root:x:0:0:root:/root:/bin/ash
+ set -e
+ echo 0
+ echo 0
+ echo root:x:0:0:root:/root:/bin/ash
+ [[ -z root:x:0:0:root:/root:/bin/ash ]]
+ exec /sbin/tini -s -- /usr/bin/spark-operator driver-py --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.deploy.PythonRunner
```

Can somebody help me with this?
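For reference, a minimal SparkApplication manifest along the lines of the spark-py-pi example is sketched below; the image tag and service-account name are assumptions for illustration, not necessarily the exact upstream values. One thing worth checking for this symptom: in cluster mode the executor pods are created by the driver itself, so the driver's service account needs RBAC permission to create pods; if that permission is missing, the driver runs but executors never appear.

```yaml
# Minimal sketch of a SparkApplication, modeled on the spark-py-pi example.
# Image tag and serviceAccount are assumed values; adjust to your cluster.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-py-pi
spec:
  type: Python
  mode: cluster
  image: gcr.io/spark-operator/spark-py:v3.0.0   # must match sparkVersion below
  mainApplicationFile: local:///opt/spark/examples/src/main/python/pi.py
  sparkVersion: "3.0.0"
  driver:
    cores: 1
    serviceAccount: spark-operator-spark   # assumed name; needs RBAC to create pods
  executor:
    cores: 1
    instances: 1
```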

wangyhwyh753 avatar Feb 04 '20 10:02 wangyhwyh753

Yep, I have the exact same problem. I'm using the latest spark-operator (v1beta2-1.2.0 with Spark 3.0.0) and trying to run the Python example. The executor is defined in the spec, but its pod never gets created. Can anyone help with this? @liyinan926?

samirmajen avatar Oct 02 '20 13:10 samirmajen

Has this been fixed yet? I only get a driver pod running when I try the example spark-py-pi.yaml with spark-operator (v1beta2, Spark 3.1.1).

flowy0 avatar Apr 13 '22 04:04 flowy0

I have the exact same issue, except I am running a custom image and the Scala example. From other posts I gather that a normal driver log should contain the spark-submit command, which does not appear here.

For example, a normal log would contain something like this:

```
exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=172.16.24.*** --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class **** local:///opt/spark/work-dir/***.jar
```
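The difference is mechanical to check: a healthy driver container execs spark-submit, while the broken one execs the operator binary directly, so no executors are ever requested. A minimal sketch (the log lines below are abbreviated from the comments above, not taken from a live cluster):

```shell
#!/bin/sh
# Sketch: tell a healthy driver log from the broken one by looking
# for the spark-submit invocation in the final exec line.
healthy='exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --deploy-mode client'
broken='exec /sbin/tini -s -- /usr/bin/spark-operator driver-py'

printf '%s\n' "$healthy" | grep -q 'spark-submit' \
  && echo 'healthy: driver hands off to spark-submit'
printf '%s\n' "$broken" | grep -q 'spark-submit' \
  || echo 'broken: spark-submit never runs, so no executor pods are requested'
```

In practice you would pipe `kubectl logs <driver-pod>` through the same grep instead of the hard-coded strings.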

LinzeSu avatar Jul 21 '22 06:07 LinzeSu