Dockerfile not working
When using the default spark-py-pi.yaml, it works fine, but it fails when I use the image built from the provided Dockerfile.
Environment: Kubernetes 1.23.4, Docker 20.10.13, Spark 3.1.1
Here are the logs:
++ id -u
+ myuid=0
++ id -g
+ mygid=0
+ set +e
++ getent passwd 0
+ uidentry=root:x:0:0:root:/root:/bin/bash
+ set -e
+ echo 0
+ echo 0
0
0
root:x:0:0:root:/root:/bin/bash
+ echo root:x:0:0:root:/root:/bin/bash
+ [[ -z root:x:0:0:root:/root:/bin/bash ]]
+ exec /usr/bin/tini -s -- /usr/bin/spark-operator driver --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.examples.SparkPi local:///spark/examples/jars/spark-examples_2.12-3.1.1.jar
W1227 07:15:38.708407 10 reflector.go:424] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:169: failed to list *v1beta2.SparkApplication: sparkapplications.sparkoperator.k8s.io is forbidden: User "system:serviceaccount:default:spark" cannot list resource "sparkapplications" in API group "sparkoperator.k8s.io" at the cluster scope
E1227 07:15:38.708493 10 reflector.go:140] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:169: Failed to watch *v1beta2.SparkApplication: failed to list *v1beta2.SparkApplication: sparkapplications.sparkoperator.k8s.io is forbidden: User "system:serviceaccount:default:spark" cannot list resource "sparkapplications" in API group "sparkoperator.k8s.io" at the cluster scope
The error message shows that the pod is running as the "system:serviceaccount:default:spark" service account, which does not have the required permissions.
To confirm that RBAC is the problem, you can create a new service account with broad permissions and then harden it afterwards:
kubectl create serviceaccount <spark-editor-service-account> -n <spark-namespace>
kubectl create rolebinding <spark-editor-role> --clusterrole=edit --serviceaccount=<spark-namespace>:<spark-editor-service-account> -n <spark-namespace>
Replace the <> placeholders with your own values, then point your application at the newly created service account.
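Since the error mentions listing "sparkapplications" at the cluster scope, the built-in edit role may not be enough on its own, and a targeted grant on that custom resource can be cleaner. A minimal sketch of such a grant is below; the object names are placeholders, and the subject matches the service account from the error message:

```yaml
# Grant the spark service account read access to SparkApplication
# objects cluster-wide. Names like "sparkapplication-reader" are
# placeholders; only the apiGroup/resource come from the error log.
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: sparkapplication-reader
rules:
- apiGroups: ["sparkoperator.k8s.io"]
  resources: ["sparkapplications"]
  verbs: ["get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: spark-sparkapplication-reader
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: sparkapplication-reader
subjects:
- kind: ServiceAccount
  name: spark        # the account from the error message
  namespace: default
```

After applying it with `kubectl apply -f`, you can check whether the permission took effect with `kubectl auth can-i list sparkapplications.sparkoperator.k8s.io --as=system:serviceaccount:default:spark`.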