spark-operator
Missing docker pull gcr.io/spark-operator/spark:v3.1.1 image
To move forward with a legacy application, I need to pull the gcr.io/spark-operator/spark:v3.1.1 image, but it seems that the image no longer exists:
❯ docker pull gcr.io/spark-operator/spark:v3.1.1
Maybe somebody removed it because of this issue:
https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/issues/1800
what would be valid alternatives?
Or is there some way to build it myself?
The Dockerfile in the repository root already requires that image:
❯ docker build .
[+] Building 0.5s (4/4) FINISHED docker:desktop-linux
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 1.49kB 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 47B 0.0s
=> ERROR [internal] load metadata for gcr.io/spark-operator/spark:v3.1.1 0.5s
=> CANCELED [internal] load metadata for docker.io/library/golang:1.19.2-alpine 0.5s
------
> [internal] load metadata for gcr.io/spark-operator/spark:v3.1.1:
------
Dockerfile:37
--------------------
35 | RUN CGO_ENABLED=0 GOOS=linux GOARCH=amd64 GO111MODULE=on go build -a -o /usr/bin/spark-operator main.go
36 |
37 | >>> FROM ${SPARK_IMAGE}
38 | USER root
39 | COPY --from=builder /usr/bin/spark-operator /usr/bin/
--------------------
ERROR: failed to solve: gcr.io/spark-operator/spark:v3.1.1: gcr.io/spark-operator/spark:v3.1.1: not found
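Since the operator's Dockerfile takes the Spark base image from a build argument (the FROM ${SPARK_IMAGE} line above), one possible workaround is to point that argument at an image that is still published. A sketch, assuming apache/spark:v3.1.3 is an acceptable substitute for your Spark version:

```shell
# Override the SPARK_IMAGE build argument so the final stage is based on
# a Spark image that still exists on a public registry.
# apache/spark:v3.1.3 is an assumption; use the tag matching your version.
docker build \
  --build-arg SPARK_IMAGE=apache/spark:v3.1.3 \
  -t spark-operator:local .
```

The resulting image then has to be pushed somewhere your cluster can pull it from, and your Helm values or manifests updated to reference it.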
Also encountering the same issue on both of my k8s clusters:
Warning Failed 22m (x4 over 23m) kubelet Failed to pull image "gcr.io/spark-operator/spark:v3.1.1": rpc error: code = Unknown desc = Error response from daemon: manifest for gcr.io/spark-operator/spark:v3.1.1 not found: manifest unknown: Failed to fetch "v3.1.1" from request "/v2/spark-operator/spark/manifests/v3.1.1".
and by using docker pull gcr.io/spark-operator/spark:v3.1.1:
Error response from daemon: manifest for gcr.io/spark-operator/spark:v3.1.1 not found: manifest unknown: Failed to fetch "v3.1.1" from request "/v2/spark-operator/spark/manifests/v3.1.1".
It seems the image is no longer available. Checking https://console.cloud.google.com/gcr/images/spark-operator/GLOBAL/spark shows that no images exist there anymore.
If it is possible, use docker pull apache/spark:v3.1.3 instead.
Once we found out about the missing Docker image, we have been using images built from the https://github.com/apache/spark-docker repository without any major problems.
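For clusters whose manifests still hard-code the old name, one stopgap (an assumption on my part, not an official fix) is to pull a published apache/spark tag and retag it under the name the manifests expect:

```shell
# Pull a still-published Spark image and retag it under the name the
# existing manifests reference. Both tags are assumptions; adjust them
# to your Spark version and registry.
docker pull apache/spark:v3.1.3
docker tag apache/spark:v3.1.3 gcr.io/spark-operator/spark:v3.1.1
```

The retagged image would then need to be pushed to a registry the cluster can reach (or loaded onto each node); pushing back to gcr.io/spark-operator is not possible, so updating the manifests is the cleaner long-term fix.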
There is no 3.1.1? Which version is recommended?
Is there any spark-py image that works with Python 2.7 for the time being, while I migrate my code to Python 3?
gcr.io/spark-operator/spark:v3.0.0 was working fine for me.
Hi @jalvarez I am trying to setup the Spark on K8s using spark-on-k8s-operator, if we build the Apache spark docker image from https://github.com/apache/spark-docker repository then should we change mainApplicationFile parameter in one of the examples code?
https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/examples/spark-pi.yaml#L27
If you build the Spark image from version 3.1.1, then you needn't change the mainApplicationFile parameter.
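For reference, a minimal sketch of the relevant fields in the spark-pi example spec, assuming a replacement image built from apache/spark-docker that keeps the standard /opt/spark layout (both the image tag and the jar version shown are assumptions to adjust):

```yaml
# Hypothetical excerpt from examples/spark-pi.yaml; only the image and
# mainApplicationFile fields are shown. If the replacement image uses the
# standard Spark layout, only the jar's version suffix needs to match it.
spec:
  image: apache/spark:v3.1.3  # replacement image (assumption)
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.1.3.jar
```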
Currently all the examples in https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/tree/master/examples appear to reference this non-existent image, which means the quick start guide is also broken.
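As a local stopgap until the examples are corrected upstream, one could rewrite the image reference across a checkout's bundled manifests. A sketch on a scratch copy, with apache/spark:v3.1.3 as the assumed replacement tag; the same sed line can be run against the repo's examples directory:

```shell
# Demonstrate the rewrite on a scratch file standing in for the repo's
# examples/*.yaml. GNU sed syntax; the replacement tag is an assumption.
mkdir -p /tmp/spark-examples
printf 'image: "gcr.io/spark-operator/spark:v3.1.1"\n' > /tmp/spark-examples/spark-pi.yaml
sed -i 's|gcr.io/spark-operator/spark:v3.1.1|apache/spark:v3.1.3|g' /tmp/spark-examples/*.yaml
grep 'image:' /tmp/spark-examples/spark-pi.yaml
```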
Is there any properly working repository anywhere?
Hopefully someone will merge the PR or correct all the examples and the Dockerfile.
Probably not, given that this repo may soon be transferred to a new owner (the Kubeflow org).
I am following the tutorial and can confirm that it complains about the missing image.
https://github.com/kubeflow/spark-operator/blob/master/examples/spark-pi.yaml
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.