# Kubectl manifest apply step changes the image unexpectedly
## Expected behavior
When deploying `test/test.yaml` via kubectl, the container image should be left as declared: `docker.io/postman/newman` (or simply `postman/newman`).
## Actual behavior
The image name gets rewritten to `localhost:5000/docker_io_postman_newman`.
## Information
- Skaffold version: v1.39.1
- Operating system: Pop!_OS (Ubuntu-based) 21.10
- Installed via: Homebrew
- Contents of `skaffold.yaml`:

```yaml
---
apiVersion: skaffold/v2beta29
kind: Config
metadata:
  name: node-persistence-tests
requires:
  - configs: ["node-persistence"]
profiles:
  - name: kind-keyless
    activation:
      - kubeContext: kind-keyless
    deploy:
      kubectl:
        manifests:
          - test/test.yaml
        hooks:
          before:
            - host:
                command:
                  - bash
                  - -c
                  - cat charts/node-persistence/collection*.json | sed -s 's#{{`{{#{{#g' | sed -s 's#}}`}}#}}#g' | sed -s 's#http://{{ include "node-persistence.fullname" . }}.{{ include "node-persistence.namespace" . }}.svc.cluster.local:8080#https://node-persistence.node-persistence.core-test.keyless.technology#g' > test/collection.json
            - host:
                command:
                  - bash
                  - -c
                  - kubectl delete configmap latest-postman-test || echo configmap already deleted
            - host:
                command:
                  - bash
                  - -c
                  - kubectl create configmap latest-postman-test --from-file=test/collection.json || echo configmap already exists
          after:
            - host:
                command:
                  - bash
                  - -c
                  - kubectl wait --for=condition=complete --timeout=1m job/test-latest
```

- Contents of `test/test.yaml` (the manifest referenced above):

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: test-latest
spec:
  template:
    spec:
      containers:
        - name: postman
          image: docker.io/postman/newman
          command:
            - newman
            - run
            - --insecure
            - /test/collection.json
          volumeMounts:
            - name: latest-postman-test
              mountPath: "/test"
              readOnly: true
      imagePullSecrets:
        - name: registry
      volumes:
        - name: latest-postman-test
          configMap:
            name: latest-postman-test
      restartPolicy: Never
```
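For context, the first `before` hook rewrites Helm-escaped template delimiters back into plain Postman variables before the collection is stored in the ConfigMap. As a standalone illustration (the input string here is hypothetical, not from the real collection file), the two brace-unescaping `sed` substitutions behave like this:

```shell
# Demonstrates the brace-unescaping done by the first before-hook:
# Helm-escaped delimiters {{`{{ ... }}`}} become plain {{ ... }}.
# The input line is a hypothetical example.
echo 'url: {{`{{baseUrl}}`}}/health' \
  | sed -s 's#{{`{{#{{#g' | sed -s 's#}}`}}#}}#g'
# -> url: {{baseUrl}}/health
```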
## Steps to reproduce the behavior
At this moment I have no idea whether this behaviour is specific to my setup, so I'll generalize the reproduction steps:
- take any YAML manifest and deploy it with `deploy.kubectl.manifests`
- check the image name
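A self-contained reproduction along these lines might look like the following minimal pair of files (the file names and image are taken from this report; the rest is a hypothetical sketch, to be run with `skaffold dev` or `skaffold run` against a cluster whose Skaffold config applies a default repo):

```yaml
# skaffold.yaml -- no build section, deploy an unmanaged image via kubectl
apiVersion: skaffold/v2beta29
kind: Config
deploy:
  kubectl:
    manifests:
      - test/test.yaml
---
# test/test.yaml -- any manifest with a hard-coded image
apiVersion: batch/v1
kind: Job
metadata:
  name: test-latest
spec:
  template:
    spec:
      containers:
        - name: postman
          image: docker.io/postman/newman
      restartPolicy: Never
```

Comparing the `image:` field in the rendered output against the manifest should show whether the rewrite occurs.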
## Debug output

```
DEBU[0056] Running command: [kubectl --context kind-keyless create --dry-run=client -oyaml -f /home/emanuele/Documents/bin/kl_helm/node-persistence/test/test.yaml] subtask=3 task=Deploy
DEBU[0056] Command output: [apiVersion: batch/v1
kind: Job
metadata:
  name: test-latest
  namespace: default
spec:
  template:
    spec:
      containers:
      - command:
        - newman
        - run
        - --insecure
        - /test/collection.json
        image: docker.io/postman/newman
        name: postman
        volumeMounts:
        - mountPath: /test
          name: latest-postman-test
          readOnly: true
      imagePullSecrets:
      - name: registry
      restartPolicy: Never
      volumes:
      - configMap:
          name: latest-postman-test
        name: latest-postman-test
] subtask=3 task=Deploy
DEBU[0056] manifests with tagged images:apiVersion: batch/v1
kind: Job
metadata:
  name: test-latest
  namespace: default
spec:
  template:
    spec:
      containers:
      - command:
        - newman
        - run
        - --insecure
        - /test/collection.json
        image: localhost:5000/docker_io_postman_newman
        name: postman
        volumeMounts:
        - mountPath: /test
          name: latest-postman-test
          readOnly: true
      imagePullSecrets:
      - name: registry
      restartPolicy: Never
      volumes:
      - configMap:
          name: latest-postman-test
        name: latest-postman-test subtask=3 task=Deploy
DEBU[0056] manifests with labels apiVersion: batch/v1
kind: Job
metadata:
  labels:
    skaffold.dev/run-id: 90cadf9c-a88b-4fd7-9b21-bbfe393f185e
  name: test-latest
  namespace: default
spec:
  template:
    spec:
      containers:
      - command:
        - newman
        - run
        - --insecure
        - /test/collection.json
        image: localhost:5000/docker_io_postman_newman
        name: postman
        volumeMounts:
        - mountPath: /test
          name: latest-postman-test
          readOnly: true
      imagePullSecrets:
      - name: registry
      restartPolicy: Never
      volumes:
      - configMap:
          name: latest-postman-test
        name: latest-postman-test subtask=-1 task=DevLoop
```
A little update: if I run `skaffold dev` with the option `--default-repo=docker.io`, the image changes slightly, to `image: docker.io/postman_newman`, which still doesn't solve the issue.
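The two rewritten names in this thread are consistent with a default-repo prefixing scheme that escapes the characters `/`, `.` and `:` to `_`. The following is a simplified model inferred from the logs above, not Skaffold's actual implementation:

```shell
# Simplified model of default-repo image rewriting (inferred from the logs
# in this issue; NOT Skaffold's actual code).
image="docker.io/postman/newman"

# Unrelated default repo: the whole image name is escaped and prefixed.
echo "localhost:5000/$(echo "$image" | sed 's#[/.:]#_#g')"
# -> localhost:5000/docker_io_postman_newman

# Default repo that is already a prefix of the image: only the remainder
# appears to be escaped.
echo "docker.io/$(echo "${image#docker.io/}" | sed 's#[/.:]#_#g')"
# -> docker.io/postman_newman
```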
Thanks for the issue @emanuele-leopardi.
Looks like this is a bug. We should not be transforming a Job kind manifest.
@emanuele-leopardi thanks for flagging this; unfortunately the team hasn't had time to investigate it. If you could make a sample repository that reproduces this for the team to pull down, we could try to prioritize a fix; alternatively, if you would like to attempt a PR, we would be happy to review it. We will update here once this is prioritized and moves out of our backlog into a milestone/sprint, but currently we are not able to schedule it for the next milestone.
I thought this was already part of GA 2.0.0 😅. Unfortunately we've dropped the use of Skaffold, so I don't have the code anymore. I'll see if I can recover that and put together an example.
/triage-action
Hi @emanuele-leopardi, were you able to recover the example with the issue? I was trying to reproduce this on my side, and I get the Job image with no modification:
```
DEBU[0001] Running command: [kubectl --context kind-keyless create --dry-run=client -oyaml -f /project/test/test.yaml] subtask=0 task=Deploy
DEBU[0001] Command output: [apiVersion: batch/v1
kind: Job
metadata:
  name: test-latest
  namespace: default
spec:
  template:
    spec:
      containers:
      - image: docker.io/postman/newman
        name: postman
      restartPolicy: Never
] subtask=0 task=Deploy
DEBU[0001] manifests with tagged images:apiVersion: batch/v1
kind: Job
metadata:
  name: test-latest
  namespace: default
spec:
  template:
    spec:
      containers:
      - image: docker.io/postman/newman
        name: postman
      restartPolicy: Never subtask=0 task=Deploy
DEBU[0001] manifests with labels apiVersion: batch/v1
kind: Job
metadata:
  labels:
    skaffold.dev/run-id: 80ad6c23-cb20-4335-82b3-4b22915cf6cf
  name: test-latest
  namespace: default
spec:
  template:
    spec:
      containers:
      - image: docker.io/postman/newman
        name: postman
      restartPolicy: Never
```
This is using Skaffold v1.39.1, and the same happens for Skaffold versions above v2.0.0. Thanks!
@renzodavid9 no, sorry, I couldn't find the example anymore. If you wish, we can close this issue.
Hey @emanuele-leopardi, got it. I'll close this then. Thanks!