Joost Döbken

Results: 14 comments by Joost Döbken

It seems like this issue is fixed when I reproduce the steps by @PoslavskySV:

```cmd
$ jupyter kernelspec list
Available kernels:
  scala212    /home/jovyan/.local/share/jupyter/kernels/scala212
  scala213    /home/jovyan/.local/share/jupyter/kernels/scala213
  scala32     /home/jovyan/.local/share/jupyter/kernels/scala32
  python3     /opt/conda/share/jupyter/kernels/python3...
```

I experience the same:

```cmd
$ helm repo add spark-operator https://googlecloudplatform.github.io/spark-on-k8s-operator
Error: looks like "https://googlecloudplatform.github.io/spark-on-k8s-operator" is not a valid chart repository or cannot be reached: failed to fetch https://googlecloudplatform.github.io/spark-on-k8s-operator/index.yaml :...
```

This works:

```cmd
helm repo add spark-operator https://kubeflow.github.io/spark-operator/
```

(from https://github.com/kubeflow/spark-operator/issues/1940)
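With the repository added, installing the operator chart follows the usual Helm steps. A minimal sketch; the release name and namespace below are placeholders I chose, not anything specified in the comment:

```cmd
# refresh the local chart index after adding the repo
helm repo update

# install the chart; release name and namespace are illustrative
helm install spark-operator spark-operator/spark-operator \
  --namespace spark-operator \
  --create-namespace
```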

I got this working quite simply, following this explainer about running Spark in client mode: https://medium.com/@sephinreji98/understanding-spark-cluster-modes-client-vs-cluster-vs-local-d3c41ea96073 **Deploy the Jupyter Spark manifest.** Include a headless service so the driver can run in `client mode` (see the sketch below) and...
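A minimal sketch of the headless-service part of that setup, assuming the Jupyter pod is labeled `app: jupyter` and that the driver and block-manager ports are pinned to 29413/29414; the service name, label, and ports are illustrative, not taken from the comment:

```cmd
# create a headless service (clusterIP: None) that resolves directly to the
# Jupyter pod, so executors can reach the Spark driver running in client mode
kubectl apply -f - <<EOF
apiVersion: v1
kind: Service
metadata:
  name: jupyter-spark-driver        # hypothetical name
spec:
  clusterIP: None                   # headless: DNS resolves straight to the pod IP
  selector:
    app: jupyter                    # must match the Jupyter pod's labels
  ports:
    - name: driver
      port: 29413
    - name: blockmanager
      port: 29414
EOF
```

Inside the notebook, `spark.driver.host` would then point at the service DNS name (e.g. `jupyter-spark-driver.<namespace>.svc.cluster.local`), with `spark.driver.port` and `spark.blockManager.port` set to the two ports exposed above.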