spark-operator
Example fails with the error `/opt/entrypoint.sh: line 40: /tmp/java_opts.txt: Permission denied`
Example: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/examples/spark-pi.yaml
Operator values:

```yaml
operatorVersion: v1beta2-1.2.1-3.0.0
imagePullSecrets:
  - name: harbor-credentials
rbac:
  create: true
serviceAccounts:
  spark:
    name: spark
controllerThread: 2
enableWebhook: true
webhookPort: 443
nodeSelector:
  destiny: spark
enableMetrics: true
```
Driver logs:

```
++ id -u
+ myuid=185
++ id -g
+ mygid=0
+ set +e
++ getent passwd 185
+ uidentry=
+ set -e
+ '[' -z '' ']'
+ '[' -w /etc/passwd ']'
+ echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false'
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
/opt/entrypoint.sh: line 40: /tmp/java_opts.txt: Permission denied
```
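For context, the failing step can be reconstructed from the xtrace above: the entrypoint filters `SPARK_JAVA_OPT_*` environment variables, sorts them by their numeric suffix, strips the variable names, and redirects the result into `/tmp/java_opts.txt`. When the container user cannot write to `/tmp`, it is this redirect that fails with `Permission denied`. A minimal sketch of that pipeline (the sample variable is illustrative, not from the original logs):

```shell
# Illustrative SPARK_JAVA_OPT_* variable, as the operator would inject it.
export SPARK_JAVA_OPT_0="-Dspark.driver.port=7078"

# Same filter/sort/strip pipeline as traced in the log above; the final
# redirect is the write that needs /tmp to be writable by the container user.
env | grep SPARK_JAVA_OPT_ | sort -t_ -k4 -n | sed 's/[^=]*=\(.*\)/\1/g' > /tmp/java_opts.txt
cat /tmp/java_opts.txt
```

Running this as a user without write access to `/tmp` reproduces the same `Permission denied` at the redirect.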
Seems related to https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/issues/1056.
Hi, I'm getting the same error message even though I haven't mounted any volumes at /tmp. The Spark application completes successfully when run in the default namespace via kubectl, but fails with the error below when run in a non-default namespace. I do have an rbac.yml (with the ServiceAccount, ClusterRole, and ClusterRoleBinding) defined for the non-default namespace.
The error message is pasted below. I'd appreciate your advice on overcoming it.
```
++ id -u
+ myuid=0
++ id -g
+ mygid=0
+ set +e
++ getent passwd 0
+ uidentry=root:x:0:0:root:/root:/bin/bash
+ set -e
+ '[' -z root:x:0:0:root:/root:/bin/bash ']'
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
/opt/entrypoint.sh: line 40: /tmp/java_opts.txt: Permission denied
```
Hi, can someone provide an update on this, please? How can we overcome the permission denied issue?
You can try the following lines if you are building your own Spark image:

```dockerfile
USER root
RUN echo "" > /tmp/java_opts.txt
RUN chmod 777 /tmp/java_opts.txt
RUN chown root:root /tmp/java_opts.txt
```
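Independently of rebuilding the image, you can confirm from inside a running driver pod whether `/tmp` is actually writable by the container user (a sketch; the pod name `spark-pi-driver` is an assumption based on the spark-pi example above):

```shell
# Run inside the driver container, e.g.:
#   kubectl exec -it spark-pi-driver -- sh
# Attempt to create the exact file entrypoint.sh writes, and report the result.
if touch /tmp/java_opts.txt 2>/dev/null; then
  echo "uid $(id -u) can write /tmp/java_opts.txt"
else
  echo "uid $(id -u) cannot write /tmp/java_opts.txt"
fi
```

If the second branch fires, the image (or a volume mounted over `/tmp`) is denying write access to the container's uid, which matches the failure at line 40 of the entrypoint.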
I ran into the same problem. When I set `webhook.enable=true`, I modified Spark's official Dockerfile by commenting out two lines:

```dockerfile
#ARG spark_uid=185
#USER ${spark_uid}
```
Hi, any solution here?