spark-operator
Provide a prefix for executor pods
Hi,
I'm using spark-on-k8s-operator to run multiple Spark applications on the same Kubernetes cluster at the same time. I have a couple of Spark jobs that run simultaneously, and for each driver pod I want to understand which executor pods are associated with it.
I can see there is some UUID associated with each executor, and I wanted to know whether there is a way for me to provide this UUID, or to provide a better prefix (one that includes a UUID). Alternatively, if there is a way to connect the UUID in an executor pod's name to its associated driver, that would be enough for me.
Thanks.
You can use the Spark config spark.kubernetes.executor.podNamePrefix to set a prefix for your executor pods.
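For reference, here is a minimal sketch of how that setting might look in a SparkApplication manifest for the operator; the application name, namespace, image, and main class below are placeholders, not values from this thread:

```yaml
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: my-app                       # placeholder application name
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: my-registry/spark:3.5.0     # placeholder image
  mainClass: org.example.Main        # placeholder main class
  mainApplicationFile: local:///opt/spark/jars/app.jar
  sparkConf:
    # With this set, executor pods are named <prefix>-exec-<n>
    # instead of the auto-generated prefix that includes a UUID.
    "spark.kubernetes.executor.podNamePrefix": "my-app-executors"
  driver:
    cores: 1
    memory: 512m
  executor:
    instances: 2
    cores: 1
    memory: 512m
```

If the prefix is left unset, Spark generates one from the application name plus a random suffix, which is the UUID-like string you are seeing.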
Spark will also set a label called spark-app-id on the driver and executor pods, with a random UUID as its value. The label value is the same for a driver and its associated executors, so you can use it to tie them together. Let me know if this helps!
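As an illustration of using that label to group pods, a kubectl label selector should work; the label key and application ID below are placeholders, so check the actual labels on your driver pod first:

```sh
# Inspect the labels actually set on a driver pod (exact keys vary by Spark/operator version).
kubectl get pod <driver-pod-name> --show-labels

# Then list every pod (driver + executors) carrying the same application-id label value.
kubectl get pods -l <label-key>=<application-id>
```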
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This issue has been automatically closed because it has not had recent activity. Please comment "/reopen" to reopen it.