exit with "missing Resource metadata"
Expected behavior
skaffold dev runs properly
Actual behavior
exits with "missing Resource metadata"; no resources except images were created
Information
- Skaffold version: v2.7.1
- Operating system: Windows 11 pro 22H2 22621.2283
- Installed via: Scoop
- Contents of skaffold.yaml:
apiVersion: skaffold/v4beta5
kind: Config
metadata:
  name: admin-service
build:
  artifacts:
    - image: piomin/admin
      jib: {}
  tagPolicy:
    gitCommit: {}
manifests:
  rawYaml:
    - ../k8s/privileges.yaml
    - k8s/**.yaml
Steps to reproduce the behavior
1. Clone https://github.com/piomin/sample-spring-microservices-kubernetes.git
2. skaffold dev
3. ...
❯ skaffold dev -vdebug
time="2023-09-23T12:18:52+08:00" level=debug msg="skaffold API not starting as it's not requested" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=info msg="Skaffold &{Version:v2.7.1 ConfigVersion:skaffold/v4beta6 GitVersion: GitCommit:
4557ab1d4c8361977bbade432169f0aa048f4310 BuildDate:2023-09-13T14:49:46Z GoVersion:go1.21.0 Compiler:gc Platform:windows/amd64 User:}" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=info msg="Loaded Skaffold defaults from \"C:\\\\Users\\\\fake\\\\.skaffold\\\\config\"" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=debug msg="config version out of date: upgrading to latest \"skaffold/v4beta6\"" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=debug msg="parsed 1 configs from configuration file D:\\dev\\opensource\\sample-spring-microservices-kubernetes\\admin-service\\skaffold.yaml" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=debug msg="Defaulting build type to local build" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=debug msg="Found raw k8s manifests without cloud run deploy, adding kubectl deployer" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=info msg="map entry found when executing locate for &{piomin/admin . <nil> {<nil> <nil> <nil> 0xc000a40370 <nil> <nil> <nil>} [] {[] []} [] } of type *latest.Artifact and pointer: 824642536880" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=info msg="Using kubectl context: docker-desktop" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=debug msg="getting client config for kubeContext: `docker-desktop`" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=debug msg="getting client config for kubeContext: `docker-desktop`" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=debug msg="Running command: [minikube version --output=json]" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=debug msg="setting Docker user agent to skaffold-v2.7.1" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=info msg="no kpt renderer or deployer found, skipping hydrated-dir creation" subtask=-1 task=DevLoop
time="2023-09-23T12:18:52+08:00" level=debug msg="Running command: [kubectl config view --minify -o jsonpath='{..namespace}']" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=debug msg="Command output: ['']" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=debug msg="CLI platforms provided: \"\"" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=debug msg="getting client config for kubeContext: `docker-desktop`" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=debug msg="platforms detected from active kubernetes cluster nodes: \"linux/amd64\"" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=debug msg="platforms selected for artifact \"piomin/admin\": \"linux/amd64\"" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=debug msg="Using builder: local" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=debug msg="push value not present in NewBuilder, defaulting to false because cluster.PushImages is false" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=info msg="build concurrency first set to 1 parsed from *local.Builder[0]" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=info msg="final build concurrency value is 1" subtask=-1 task=DevLoop
Generating tags...
- piomin/admin -> time="2023-09-23T12:18:53+08:00" level=debug msg="config version out of date: upgrading to latest \"skaffold/v4beta6\"" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=debug msg="config version out of date: upgrading to latest \"skaffold/v4beta6\"" subtask=-1 task=DevLoop
time="2023-09-23T12:18:53+08:00" level=debug msg="Running command: [git describe --tags --always]" subtask=-1 task=Build
time="2023-09-23T12:18:53+08:00" level=debug msg="Command output: [8150a9a\n]" subtask=-1 task=Build
time="2023-09-23T12:18:53+08:00" level=debug msg="Running command: [git status . --porcelain]" subtask=-1 task=Build
time="2023-09-23T12:18:53+08:00" level=debug msg="Command output: []" subtask=-1 task=Build
piomin/admin:8150a9a
time="2023-09-23T12:18:53+08:00" level=info msg="Tags generated in 176.932ms" subtask=-1 task=Build
Checking cache...
time="2023-09-23T12:18:53+08:00" level=debug msg="Running command: [java -version]" subtask=-1 task=Build
time="2023-09-23T12:18:53+08:00" level=debug msg="Running command: [mvn jib:_skaffold-fail-if-jib-out-of-date -Djib.requiredVersion=1.4.0 --non-recursive jib:_skaffold-files-v2 --quiet --batch-mode]" subtask=-1 task=Build
time="2023-09-23T12:18:56+08:00" level=debug msg="Command output: [\r\nBEGIN JIB JSON\r\n{\"build\":[\"D:\\\\dev\\\\opensource\\
\\sample-spring-microservices-kubernetes\\\\admin-service\\\\pom.xml\"],\"inputs\":[\"D:\\\\dev\\\\opensource\\\\sample-spring-m
icroservices-kubernetes\\\\admin-service\\\\src\\\\main\\\\java\",\"D:\\\\dev\\\\opensource\\\\sample-spring-microservices-kuber
netes\\\\admin-service\\\\src\\\\main\\\\resources\",\"D:\\\\dev\\\\opensource\\\\sample-spring-microservices-kubernetes\\\\admi
n-service\\\\src\\\\main\\\\resources\",\"D:\\\\dev\\\\opensource\\\\sample-spring-microservices-kubernetes\\\\admin-service\\\\
src\\\\main\\\\jib\"],\"ignore\":[]}\r\n], stderr: Picked up JAVA_TOOL_OPTIONS: -Duser.language=en -Dfile.encoding=UTF8\n" subtask=-1 task=Build
time="2023-09-23T12:18:56+08:00" level=debug msg="could not stat dependency: CreateFile D:\\\\dev\\\\opensource\\\\sample-spring
-microservices-kubernetes\\\\admin-service\\\\src\\\\main\\\\jib: The system cannot find the file specified." subtask=-1 task=DevLoop
time="2023-09-23T12:18:56+08:00" level=debug msg="Found dependencies for jib maven artifact: [pom.xml src\\main\\java\\pl\\piomi
n\\services\\admin\\AdminApplication.java src\\main\\java\\pl\\piomin\\services\\admin\\config\\SecurityConfiguration.java src\\main\\resources\\bootstrap.yml src\\main\\resources\\bootstrap.yml]" subtask=-1 task=Build
time="2023-09-23T12:18:56+08:00" level=debug msg="push value not present in isImageLocal(), defaulting to false because cluster.PushImages is false" subtask=-1 task=DevLoop
- piomin/admin: time="2023-09-23T12:18:56+08:00" level=debug msg="push value not present in isImageLocal(), defaulting to false because cluster.PushImages is false" subtask=-1 task=DevLoop
Found Locally
time="2023-09-23T12:18:56+08:00" level=debug msg="push value not present in isImageLocal(), defaulting to false because cluster.PushImages is false" subtask=-1 task=DevLoop
time="2023-09-23T12:18:56+08:00" level=info msg="Cache check completed in 2.895 seconds" subtask=-1 task=Build
time="2023-09-23T12:18:56+08:00" level=info msg="Starting render..." subtask=-1 task=DevLoop
time="2023-09-23T12:18:56+08:00" level=info msg="starting render process" subtask=0 task=Render
time="2023-09-23T12:18:56+08:00" level=debug msg="Executing template &{envTemplate 0xc000175560 0xc0003f45a0 } with environment
map[:::=::\\ ALLUSERSPROFILE:....]" subtask=-1 task=DevLoop
time="2023-09-23T12:18:56+08:00" level=debug msg="Executing template &{envTemplate 0xc0001aed80 0xc0003f45f0 } with environment
map[:::=::\\ ALLUSERSPROFILE:...]" subtask=-1 task=DevLoop
Cleaning up...
time="2023-09-23T12:18:56+08:00" level=debug msg="Running command: [kubectl --context docker-desktop delete --ignore-not-found=true --wait=false -f -]" subtask=-1 task=DevLoop
- No resources found
time="2023-09-23T12:18:56+08:00" level=info msg="Cleanup completed in 174.9103ms" subtask=-1 task=DevLoop
missing Resource metadata
time="2023-09-23T12:18:56+08:00" level=debug msg="exporting metrics disabled" subtask=-1 task=DevLoop
Ah, I get the same issue too; when using chartPath locally it errors.
Got the same error, any workaround?
Happening to me, too. It was working just fine, then out of nowhere this has been stopping me in my tracks. Full removal and reinstall didn't fix
I hit this issue and it was due to changes in a local helm chart being deployed by skaffold (using manifests.helm.releases in skaffold spec).
I started undoing changes until the error went away; it seems to have been caused by me commenting out a cookiecutter PVC Helm template (adding # to every line).
I found that if I remove the labels lines, I do not get the error. With them present, I get the same missing Resource metadata. It appears some templating is still happening inside these comments? Perhaps the YAML comments are not respected properly?
Reproduces error:
# kind: PersistentVolumeClaim
# apiVersion: v1
# metadata:
#   name: {{ include "test.fullname" . }}-test
#   labels:
#     {{- include "test.labels" . | nindent 4 }}
# spec:
#   accessModes:
#     {{- range .Values.storage.accessModes }}
#     - {{ . | quote }}
#     {{- end }}
#   resources:
#     requests:
#       storage: {{ .Values.storage.size | quote }}
#   {{- with .Values.storage.storageClassName }}
#   storageClassName: {{ . | quote }}
#   {{- end }}
Does not reproduce error / works as expected. (Label include lines removed)
# kind: PersistentVolumeClaim
# apiVersion: v1
# metadata:
#   name: {{ include "test.fullname" . }}-test
# spec:
#   accessModes:
#     {{- range .Values.storage.accessModes }}
#     - {{ . | quote }}
#     {{- end }}
#   resources:
#     requests:
#       storage: {{ .Values.storage.size | quote }}
#   {{- with .Values.storage.storageClassName }}
#   storageClassName: {{ . | quote }}
#   {{- end }}
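A likely explanation (my reading, not confirmed in this thread): Helm runs the Go templating pass before the output is parsed as YAML, so a leading `#` comments out the rendered YAML line but does not stop the `{{ ... }}` actions from executing. In particular, `{{- include "test.labels" . | nindent 4 }}` still runs and `nindent` emits freshly indented lines that land outside the comment. A sketch of disabling the same block with a Go template comment instead, so nothing inside it is rendered at all (template names taken from the example above):

```yaml
{{/*
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: {{ include "test.fullname" . }}-test
  labels:
    {{- include "test.labels" . | nindent 4 }}
*/}}
```

This works because everything between `{{/*` and `*/}}` is discarded by the template engine before YAML parsing.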
Hello, just bumping this up. It's 1 am and I just arrived here. I have the same error with the same root cause.
In my case the issue was that my manifests were using CRLF (Windows) line breaks. Switching to LF line breaks resolved the issue.
I was facing a similar issue. I tried to isolate each config separately and realised the issue might be happening on Windows with "ClusterRoleBinding" manifests (for the ClusterRole roleRef). When I comment out that part of the YAML, everything gets loaded as usual. Could this be an issue?
@c3c @prestonyun @userbradley
have you found a solution?
I tried many times and still the same, the problem is exactly the same as in the first post and I am able to reproduce it.
I received the same error. I was deploying PostgreSQL and used a ConfigMap to define the DB name and DB host; when I removed that code and used the values directly, it worked like a charm. This was for a local setup.
Old Code:
#Removed configMap and its referred value
apiVersion: v1
kind: ConfigMap
metadata:
  name: task-postgres-configmap
data:
  task-db-host: my_db_host_value
  task-db-name: my_db_name
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: task-postgres-depl
spec:
  replicas: 1
  selector:
    matchLabels:
      app: task-postgres
  template:
    metadata:
      labels:
        app: task-postgres
    spec:
      containers:
        - name: task-postgres
          image: postgres:16-alpine
          ports:
            - containerPort: 5432
          env:
            - name: DB_USER
              valueFrom:
                secretKeyRef:
                  name: task-db-user-secret
                  key: DB_USER
            - name: DB_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: task-db-password-secret
                  key: DB_PASSWORD
            - name: DB_HOST
              #Removed
              valueFrom:
                configMapKeyRef:
                  name: task-postgres-configmap
                  key: task-db-host
            - name: DB_NAME
              #Removed
              valueFrom:
                configMapKeyRef:
                  name: task-postgres-configmap
                  key: task-db-name
---
apiVersion: v1
kind: Service
metadata:
  name: task-postgres-srv
spec:
  selector:
    app: task-postgres
  type: ClusterIP
  ports:
    - protocol: TCP
      port: 5432
      targetPort: 5432
Updated Code:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: task-postgres-depl
spec:
  replicas: 1
  selector:
    matchLabels:
      app: task-postgres
  template:
    metadata:
      labels:
        app: task-postgres
    spec:
      containers:
        - name: task-postgres
          image: postgres:16-alpine
          ports:
            - containerPort: 5432
          env:
            - name: POSTGRES_USER
              valueFrom:
                secretKeyRef:
                  name: task-db-user-secret
                  key: DB_USER
            - name: POSTGRES_PASSWORD
              valueFrom:
                secretKeyRef:
                  name: task-db-password-secret
                  key: DB_PASSWORD
            - name: POSTGRES_HOST
              value: my_db_host_value
            - name: POSTGRES_DB
              value: my_db_name_value
---
apiVersion: v1
kind: Service
metadata:
  name: task-postgres-srv
spec:
  selector:
    app: task-postgres
  type: ClusterIP
  ports:
    - protocol: TCP
      port: 5432
      targetPort: 5432
Hi, does anyone have a solution for this? I'm currently using v4beta11; it worked properly some weeks ago, but it suddenly started returning the following message and exiting:
Cleaning up...
- No resources found
Pruning images...
missing Resource metadata
Here is my skaffold.yaml
apiVersion: skaffold/v4beta11
kind: Config
metadata:
  name: skaffold
manifests:
  rawYaml:
    - ./infra/k8s/auth-depl.yaml
build:
  local:
    push: false
  artifacts:
    - image: meccar/auth
      context: auth
      docker:
        dockerfile: Dockerfile
      sync:
        manual:
          - src: "src/**/*.js"
            dest: .
Here is the auth-depl.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: auth-depl
  labels:
    app: auth
spec:
  replicas: 1
  selector:
    matchLabels:
      app: auth
  template:
    metadata:
      labels:
        app: auth
    spec:
      containers:
        - name: auth
          image: meccar/auth:latest
---
apiVersion: v1
kind: Service
metadata:
  name: auth-clusterip-srv
spec:
  selector:
    app: auth
  type: ClusterIP
  ports:
    - name: auth
      protocol: TCP
      port: 8001
      targetPort: 8001
I tried to reinstall Skaffold using Chocolatey, but the error still remains.
- Operating system: Windows 11 Pro
- Minikube version: v1.33.1
- Skaffold version: v2.12.0
- Kustomize version: v5.0.4-0.20230601165947-6ce0bf390ce3
- Docker version: 26.1.4