Faulty Kustomize version on kubectl version output
What happened:
A non-existent Kustomize version is shown when kubectl version is run:
/ # kubectl version
Client Version: v1.28.2
Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
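For context, the reported string is formatted like a Go module pseudo-version, which Go assigns when a dependency is pinned to an untagged commit rather than a tagged release; that would explain why no such Kustomize release exists. Below is a minimal sketch of how to check this, using the golang.org/x/mod/module package (the version string is copied from the output above):

```go
package main

import (
	"fmt"

	"golang.org/x/mod/module"
)

func main() {
	// The Kustomize version reported by kubectl v1.28.2 (see output above).
	v := "v5.0.4-0.20230601165947-6ce0bf390ce3"

	if module.IsPseudoVersion(v) {
		// A pseudo-version means the kubectl build pinned an untagged
		// Kustomize commit, not a released version.
		base, _ := module.PseudoVersionBase(v) // last tagged release before the commit
		rev, _ := module.PseudoVersionRev(v)   // the pinned commit hash prefix
		fmt.Printf("%s is a pseudo-version (base %s, commit %s)\n", v, base, rev)
	} else {
		fmt.Printf("%s looks like a tagged release\n", v)
	}
}
```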
What you expected to happen:
An existing Kustomize version to be displayed.
How to reproduce it (as minimally and precisely as possible):
Install kubectl v1.28.2
Run kubectl version
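The check can also be scripted. Here is a rough sketch, assuming kubectl is on the PATH and that, as in the v1.28 output, kubectl version -o json exposes a top-level kustomizeVersion field (both are assumptions about your environment):

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"os/exec"

	"golang.org/x/mod/module"
)

func main() {
	// --client avoids needing a reachable cluster; -o json gives
	// machine-readable output (assumed to match the v1.28 shape).
	out, err := exec.Command("kubectl", "version", "--client", "-o", "json").Output()
	if err != nil {
		log.Fatal(err)
	}

	var v struct {
		KustomizeVersion string `json:"kustomizeVersion"` // assumed field name
	}
	if err := json.Unmarshal(out, &v); err != nil {
		log.Fatal(err)
	}

	if module.IsPseudoVersion(v.KustomizeVersion) {
		fmt.Printf("kubectl embeds an untagged Kustomize commit: %s\n", v.KustomizeVersion)
	} else {
		fmt.Printf("Kustomize version: %s\n", v.KustomizeVersion)
	}
}
```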
Anything else we need to know?: Running on Alpine Linux.
Environment:
- Kubernetes client and server versions (use kubectl version): v1.28.2 and v1.25, respectively
- OS (e.g. cat /etc/os-release): Alpine Linux
This issue is currently awaiting triage.
SIG CLI takes a lead on issue triage for this repo, but any Kubernetes member can accept issues by applying the triage/accepted label.
The triage/accepted label can be added by org members by writing /triage accepted in a comment.
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
Also, when the kustomization has managedByLabel in buildMetadata, it writes the label app.kubernetes.io/managed-by: kustomize-(devel), which is an invalid label value for k8s resources.
I ran into this issue using the bitnami/kubectl Docker image, which failed my kustomize build with this error:
Error from server (Invalid): error when creating "STDIN": ConfigMap "<redacted>" is invalid: metadata.labels: Invalid value: "kustomize-(devel)": a valid label must be an empty string or consist of alphanumeric characters, '-', '_' or '.', and must start and end with an alphanumeric character (e.g. 'MyValue', or 'my_value', or '12345', regex used for validation is '(([A-Za-z0-9][-A-Za-z0-9_.]*)?[A-Za-z0-9])?')
I worked around it by downgrading to bitnami/kubectl:1.25.15.
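The rejection can be reproduced client-side with the same label validation the apiserver error quotes above. A minimal sketch, assuming the k8s.io/apimachinery module is available; the kustomize-v5.0.1 value is only an illustrative well-formed counterpart, not what any particular build writes:

```go
package main

import (
	"fmt"

	"k8s.io/apimachinery/pkg/util/validation"
)

func main() {
	// The managed-by value written by the affected build, versus an
	// illustrative well-formed value.
	for _, v := range []string{"kustomize-(devel)", "kustomize-v5.0.1"} {
		// IsValidLabelValue returns a list of validation error messages,
		// empty if the value is acceptable. Parentheses are outside the
		// allowed character set, so the first value is rejected.
		if errs := validation.IsValidLabelValue(v); len(errs) > 0 {
			fmt.Printf("%q is rejected: %s\n", v, errs[0])
		} else {
			fmt.Printf("%q is a valid label value\n", v)
		}
	}
}
```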
The Kubernetes project currently lacks enough contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Mark this issue as fresh with /remove-lifecycle stale
- Close this issue with /close
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle stale
The Kubernetes project currently lacks enough active contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed
You can:
- Mark this issue as fresh with /remove-lifecycle rotten
- Close this issue with /close
- Offer to help out with Issue Triage
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle rotten