
Custom environment variables provided in a Kubernetes Spark job are not getting picked up

[Open] focode opened this issue 9 months ago • 2 comments

This is the YAML of my Spark job:

```yaml
kind: SparkApplication
metadata:
  name: operatordc1
  namespace: spark
spec:
  type: Java
  mode: cluster
  image: "xiotxpcdevcr.azurecr.io/spark-custom:release-8.0"
  imagePullPolicy: Always
  imagePullSecrets:
    - mdsp-secret-spark
  mainClass: "org.springframework.boot.loader.JarLauncher"
  mainApplicationFile: "local:///opt/spark/examples/jars/operatordc1.jar" # Ensure this is the correct path within your Docker image
  sparkVersion: "3.4.2"
  restartPolicy:
    type: Never
  driver:
    env:
      - name: spring.profiles.active
        value: azure,secured
    cores: 4
    coreLimit: "4000m"
    memory: "4096m"
    javaOptions: >-
      -Dlog4j.configuration=file:///log4j2.xml
      --add-opens=java.base/java.lang=ALL-UNNAMED
      --add-opens=java.base/java.lang.reflect=ALL-UNNAMED
      --add-opens=java.base/java.nio=ALL-UNNAMED
      --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
      --add-opens=java.base/java.util=ALL-UNNAMED
      --add-opens=java.base/java.lang.invoke=ALL-UNNAMED
      --add-opens=java.base/jdk.internal.misc=ALL-UNNAMED
      -XX:+UseG1GC
      -XX:MaxGCPauseMillis=200
      -XX:G1HeapRegionSize=32M
      -XX:ReservedCodeCacheSize=100M
      -XX:MaxMetaspaceSize=256m
      -XX:CompressedClassSpaceSize=256m
      -Xms1024m
      -Dlog4j.debug
    labels:
      version: "3.4.2"
    serviceAccount: spark
  executor:
    cores: 4
    instances: 1
    memory: "4096m"
    javaOptions: >-
      -Dlog4j.configuration=file:///log4j2.xml
      -XX:ReservedCodeCacheSize=100M
      -XX:MaxMetaspaceSize=256m
      -XX:CompressedClassSpaceSize=256m
      --add-opens=java.base/sun.nio.ch=ALL-UNNAMED
      -Dlog4j.debug
    labels:
      version: "3.4.2"
    serviceAccount: spark
    env:
      - name: spring.profiles.active
        value: "azure,secured"
  sparkConf:
    "spark.driver.userClassPathFirst": "true"
    "spark.executor.userClassPathFirst": "true"
    "spark.driver.memory": "4096m"
    "spark.executor.memory": "4096m"
    "spark.dynamicAllocation.enabled": "true"
```

When I describe the pod, I see only the env values provided by the Spark operator:

```
Environment:
  SPARK_USER:                 root
  SPARK_APPLICATION_ID:       spark-699c7647354544e293cc2c12cda9e88e
  SPARK_DRIVER_BIND_ADDRESS:  (v1:status.podIP)
  SPARK_LOCAL_DIRS:           /var/data/spark-c6e072fb-2e09-4a07-8c58-0365eda4f362
  SPARK_CONF_DIR:             /opt/spark/conf
```

It is missing my custom variable `spring.profiles.active` with value `"azure,secured"`.
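For reference, the dump above is the `Environment:` section from `kubectl describe`, along these lines (driver pod name assumed from the operator's `<app name>-driver` convention):

```sh
# Show the environment injected into the driver pod
kubectl -n spark describe pod operatordc1-driver | grep -A 10 'Environment:'
```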

focode · May 08 '24 12:05

I take it you are using the Webhook?

I've observed the same behaviour recently. I believe the Mutating webhook injects these.

What's in your Operator logs?

Are you seeing TLS handshake errors to the K8s API server?
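For anyone hitting this: pod-level fields in the SparkApplication spec such as `env`, `envFrom`, and `volumes` are applied by the operator's mutating admission webhook, so they are silently dropped when the webhook is disabled or its TLS setup is broken. A quick way to check (a sketch, assuming a Helm release named `spark-operator` in the `spark-operator` namespace; adjust names to your install):

```sh
# Is a mutating webhook registered for the operator at all?
kubectl get mutatingwebhookconfigurations | grep -i spark

# Look for TLS handshake or patch failures in the operator logs
kubectl -n spark-operator logs deploy/spark-operator | grep -iE 'webhook|tls|handshake'

# If the chart was installed with the webhook disabled, upgrade with it enabled
# (the exact value name can vary between chart versions; check `helm show values`)
helm upgrade spark-operator spark-operator/spark-operator \
  --namespace spark-operator \
  --set webhook.enable=true
```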

SamBird · May 24 '24 20:05

This looks the same as "Unable to assign environment variables"; check your webhook first, @focode.
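If enabling the webhook isn't an option, a workaround that avoids it entirely is to pass the variables through `sparkConf` using Spark's own `spark.kubernetes.driverEnv.*` and `spark.executorEnv.*` properties, which `spark-submit` applies without any pod mutation. A sketch based on the spec above (note the renamed variable: Spring Boot's relaxed binding maps `SPRING_PROFILES_ACTIVE` to `spring.profiles.active`):

```yaml
spec:
  sparkConf:
    # Set by spark-submit itself, so no mutating webhook is required
    "spark.kubernetes.driverEnv.SPRING_PROFILES_ACTIVE": "azure,secured"
    "spark.executorEnv.SPRING_PROFILES_ACTIVE": "azure,secured"
```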

imtzer · Jun 03 '24 04:06