Maximilian Engelhardt
This is a known issue (see [README of the base image](https://github.com/panovvv/hadoop-hive-spark-docker#version-compatibility-notes)). I've managed to establish a connection from Spark to Hive by simply upgrading to Spark 3.0.1, but I'm getting a...
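For reference, a minimal sketch of how such a connection could be set up after the upgrade. The metastore URI below is an assumption based on a typical docker-compose layout for that image, not taken from the thread, so adjust it to your setup:

```python
# Minimal sketch: Spark 3.0.1 session with Hive support enabled.
# "thrift://hive-metastore:9083" is a hypothetical metastore address.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("spark-hive-check")
    .config("hive.metastore.uris", "thrift://hive-metastore:9083")
    .enableHiveSupport()
    .getOrCreate()
)

# Quick smoke test that the metastore is reachable.
spark.sql("SHOW DATABASES").show()
```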
I think Docker support might be helpful - especially since many Deep Learning practitioners using this might not be familiar with the Go toolchain. Imagining an executable image like...
## ✅ CML Spark configuration from the Workbench runtime (with Spark 3.2 enabled):

```
spark._sc.getConf().getAll()

[('spark.eventLog.enabled', 'true'),
 ('spark.network.crypto.enabled', 'true'),
 ('spark.sql.hive.hwc.execution.mode', 'spark'),
 ('spark.kubernetes.driver.pod.name', 'ikgvegy1ovk9i3l7'),
 ('spark.kubernetes.namespace', 'mlx-user-9'),
 ('spark.yarn.access.hadoopFileSystems', 's3a://goes-se-sandbox01/warehouse/tablespace/external/hive'),
 ('spark.kerberos.renewal.credentials', 'ccache'),
 ('spark.sql.catalog.spark_catalog', ...
```
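If it helps to narrow down that output, here is a small sketch (assuming the same `spark` session object from the Workbench runtime) that prints only the Hive- and Kerberos-related entries:

```python
# Filter the full config dump down to keys relevant for the Hive/Kerberos setup.
# Assumes the pre-created `spark` session from the CML Workbench runtime above.
conf = dict(spark._sc.getConf().getAll())

for key, value in sorted(conf.items()):
    if "hive" in key.lower() or "kerberos" in key.lower():
        print(f"{key} = {value}")
```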