spark-operator
Manage SparkApplications with different Spark versions by specifying Spark version inside manifest
Hi! Is it possible to run SparkApplications with different Spark versions by using two separate Spark Operator instances and specifying the Spark version inside the manifest? For example: we have two Spark Operators inside a Kubernetes cluster, one from Cloudflow supporting version 2.4.5 and one installed manually from the official Helm repo supporting version 3.0.0. We deploy a SparkApplication manifest with sparkVersion: "2.4.5" and it runs under the Cloudflow Spark Operator; then we deploy a SparkApplication manifest with sparkVersion: "3.0.0" and it runs under the operator that supports 3.0.0.
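
To illustrate what I mean, here is a minimal sketch of the second manifest. The image, jar path, and service account are placeholders for the sake of the example; the point is the sparkVersion field, which I would like to use to decide which operator picks up the application.

```yaml
# Hypothetical SparkApplication intended for the Spark 3.0.0 operator.
# Image, jar path, and service account below are placeholders.
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-pi-3-0-0
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: "gcr.io/spark-operator/spark:v3.0.0"   # placeholder Spark 3.0.0 image
  imagePullPolicy: Always
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0.jar"  # placeholder path
  sparkVersion: "3.0.0"   # the field I would like the two operators to dispatch on
  restartPolicy:
    type: Never
  driver:
    cores: 1
    memory: "512m"
    serviceAccount: spark   # placeholder service account
  executor:
    cores: 1
    instances: 1
    memory: "512m"
```

The 2.4.5 manifest would look the same except for sparkVersion: "2.4.5" and a Spark 2.4.5 image.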