Trivy DB Download from Private Registry errors with Authentication Required
What steps did you take and what happened:
I was installing trivy-operator on Kubernetes, and using Artifactory as a private registry.
I have tried the second, third and fourth options from https://aquasecurity.github.io/trivy-operator/v0.5.0/tutorials/private-registries/ to configure an image pull secret for this. The other images download fine; it is only the Trivy database download that fails, with the error below:
init error: DB error: failed to download vulnerability DB: database download error: OCI repository error: 1 error occurred:
* GET https://artifactory.lab.local/artifactory/api/docker/null/v2/token?scope=repository%3Ak8s%2Faquasecurity%2Ftrivy-db%3Apull&service=artifactory.lab.local: : Authentication is required
What did you expect to happen: I would expect this to be able to authenticate, since the image pull secret is already provided.
Anything else you would like to add: I have gone through a number of past/closed Trivy issues for similar occurrences, but none have helped so far.
Environment:
- Trivy-Operator version (use trivy-operator version): v0.18.5
- Kubernetes version (use kubectl version): 1.25.15
- OS (macOS 10.15, Windows 10, Ubuntu 19.10 etc): RHEL 8
Adding some more context here for the benefit of the maintainers.
We hit an initial obstacle while hosting the Trivy DB outside Kubernetes using the Trivy CLI in server mode. We got the operator talking to it by adding the following line to the trivy-operator-trivy-config ConfigMap:
data:
  trivy.serverURL: <serverURL>
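For completeness, a sketch of the full ConfigMap with that line in place. trivy.mode: ClientServer is our assumption of the accompanying client/server setting; <serverURL> stays a placeholder as above:

```yaml
# Hedged sketch of trivy-operator-trivy-config with server mode enabled.
# Key names follow the trivy-operator convention used in this thread;
# verify them against your operator version.
apiVersion: v1
kind: ConfigMap
metadata:
  name: trivy-operator-trivy-config
  namespace: trivy-system
data:
  trivy.mode: ClientServer
  trivy.serverURL: <serverURL>
```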
But the issue of not being able to reach Artifactory still persists. The scan-vulnerabilityreport pods error out with:
2024-04-05T05:50:00.642Z FATAL image scan error: scan error: unable to initialize a scanner: unable to initialize a remote image scanner: 4 errors occurred:
* docker error: unable to inspect the image (artifactory.deshaw.com/******): Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
* containerd error: containerd socket not found: /run/containerd/containerd.sock
* podman error: unable to initialize Podman client: no podman socket found: stat podman/podman.sock: no such file or directory
* remote error: Get "https://artifactory.deshaw.com/v2/": tls: failed to verify certificate: x509: certificate signed by unknown authority
We attempted to provide the imagePullSecret via different means:
- Mentioning in a service account (as suggested in the documentation)
- Updating the operator deployment to add the secret name under Pod specification:
imagePullSecrets:
- name: artifactory-login-cfg
- Environment variables: we also edited the trivy-operator-trivy-config, adding trivy.imagePullSecret: artifactory-login-cfg.
But to no avail.
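For reference, the service-account variant we tried can be sketched as follows (the secret name is the one used throughout this thread; everything else is the default operator service account):

```yaml
# Hedged sketch: attaching the pull secret to the operator's service
# account, which the operator can then propagate to scan jobs.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: trivy-operator
  namespace: trivy-system
imagePullSecrets:
  - name: artifactory-login-cfg
```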
We tried to skip TLS verification by passing TRIVY_INSECURE: true in the trivy-operator-config ConfigMap (consumed as an env var by the deployment), but the issue persists.
It is also worth mentioning that in the pod/trivy-operator logs we see this while it is coming up:
W0415 12:01:06.089154 1 reflector.go:539] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:229: failed to list *v1alpha1.ClusterSbomReport: json: cannot unmarshal array into Go struct field Metadata.items.report.components.metadata.tools of type v1alpha1.Tools
E0415 12:01:06.089232 1 reflector.go:147] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:229: Failed to watch *v1alpha1.ClusterSbomReport: failed to list *v1alpha1.ClusterSbomReport: json: cannot unmarshal array into Go struct field Metadata.items.report.components.metadata.tools of type v1alpha1.Tools
{"level":"info","ts":"2024-04-15T12:01:06Z","msg":"Starting workers","controller":"vulnerabilityreport","controllerGroup":"aquasecurity.github.io","controllerKind":"VulnerabilityReport","worker count":1}
W0415 12:01:07.094273 1 reflector.go:539] pkg/mod/k8s.io/[email protected]/tools/cache/reflector.go:229: failed to list *v1alpha1.ClusterSbomReport: json: cannot unmarshal array into Go struct field Metadata.items.report.components.metadata.tools of type v1alpha1.Tools
and this at the end, when it crashes and restarts:
{"level":"error","ts":"2024-04-16T10:20:47Z","msg":"Reconciler error","controller":"resourcequota","controllerGroup":"","controllerKind":"ResourceQuota","ResourceQuota":{"name":"namespace-manager-resource-quota","namespace":"srivahim"},"namespace":"srivahim","name":"namespace-manager-resource-quota","reconcileID":"f6f68afb-73d7-416a-859c-680e1750e83b","error":"evaluating resource: no policies found for kind: ResourceQuota","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-16T10:20:47Z","msg":"Reconciler error","controller":"resourcequota","controllerGroup":"","controllerKind":"ResourceQuota","ResourceQuota":{"name":"namespace-manager-resource-quota","namespace":"sys-benchmarks"},"namespace":"sys-benchmarks","name":"namespace-manager-resource-quota","reconcileID":"18033501-9e59-4331-a416-f889e133fe43","error":"evaluating resource: no policies found for kind: ResourceQuota","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-16T10:20:47Z","msg":"Reconciler error","controller":"job","controllerGroup":"batch","controllerKind":"Job","Job":{"name":"irdocgen-dev-dbmigrate","namespace":"ir-docgen-dev"},"namespace":"ir-docgen-dev","name":"irdocgen-dev-dbmigrate","reconcileID":"f6a7020b-e84e-4fb0-b5fd-a307247aeb08","error":"evaluating resource: no policies found for kind: Job","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-16T10:20:47Z","msg":"Reconciler error","controller":"resourcequota","controllerGroup":"","controllerKind":"ResourceQuota","ResourceQuota":{"name":"namespace-manager-resource-quota","namespace":"sys-status-page-qa"},"namespace":"sys-status-page-qa","name":"namespace-manager-resource-quota","reconcileID":"f7e31c62-95b7-47c9-91bb-e7ce246431ca","error":"evaluating resource: no policies found for kind: ResourceQuota","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
@chheda-deshaw please uninstall the operator and delete all CRDs:
kubectl delete crd vulnerabilityreports.aquasecurity.github.io
kubectl delete crd exposedsecretreports.aquasecurity.github.io
kubectl delete crd configauditreports.aquasecurity.github.io
kubectl delete crd clusterconfigauditreports.aquasecurity.github.io
kubectl delete crd rbacassessmentreports.aquasecurity.github.io
kubectl delete crd infraassessmentreports.aquasecurity.github.io
kubectl delete crd clusterrbacassessmentreports.aquasecurity.github.io
kubectl delete crd clustercompliancereports.aquasecurity.github.io
kubectl delete crd clusterinfraassessmentreports.aquasecurity.github.io
kubectl delete crd sbomreports.aquasecurity.github.io
kubectl delete crd clustersbomreports.aquasecurity.github.io
kubectl delete crd clustervulnerabilityreports.aquasecurity.github.io
then install the latest operator.
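The per-CRD deletes above can be collapsed into one loop; a sketch that only prints the commands (actually running them needs a live cluster):

```shell
# Build the delete command for every trivy-operator CRD listed above.
# Echoed into a script rather than executed, so it can be reviewed first.
for crd in vulnerabilityreports exposedsecretreports configauditreports \
    clusterconfigauditreports rbacassessmentreports infraassessmentreports \
    clusterrbacassessmentreports clustercompliancereports \
    clusterinfraassessmentreports sbomreports clustersbomreports \
    clustervulnerabilityreports; do
  echo "kubectl delete crd ${crd}.aquasecurity.github.io"
done > delete-crds.sh
```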
@phsys please share your config maps
trivy-operator.txt trivy-operator-config.txt trivy-operator-policies-config.txt trivy-operator-trivy-config.txt
As you can see, we have made certain alterations to the ConfigMaps to help the operator go through Artifactory. These would still be needed.
@chheda-deshaw please uninstall operator delete all crds : ... install latest operator
So we could delete the operator and reinstall it, but we would need these changes again, so wouldn't we be back to square one?
@chen-keinan We uninstalled the operator and deleted all the CRDs, followed by installing the latest version (0.19.4). It still errors out with the same:
kubectl logs scan-vulnerabilityreport-9bcb4456c-mq245 -n trivy-system
2024-04-16T13:46:20.230Z FATAL image scan error: scan error: unable to initialize a scanner: unable to initialize a remote image scanner: 6 errors occurred:
* docker error: unable to inspect the image (artifactory.deshaw.com/k8s-qa/cdnqa:59bcf02): Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
* containerd error: containerd socket not found: /run/containerd/containerd.sock
* podman error: unable to initialize Podman client: no podman socket found: stat podman/podman.sock: no such file or directory
* remote error: Get "https://artifactory.deshaw.com/v2/": tls: failed to verify certificate: x509: certificate signed by unknown authority
* remote error: Get "https://artifactory.deshaw.com/v2/": tls: failed to verify certificate: x509: certificate signed by unknown authority
* remote error: Get "https://artifactory.deshaw.com/v2/": tls: failed to verify certificate: x509: certificate signed by unknown authority
*
Have you set up insecure registries? I can't find it in your config.
Also make sure the DB repository is set to insecure.
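Assuming the operator's documented config keys (verify the exact key names against your trivy-operator version; they are our best reading of the suggestion above), those two settings would look like this in the trivy-operator-trivy-config ConfigMap:

```yaml
# Hedged sketch: insecure-registry settings in trivy-operator-trivy-config.
data:
  # skip TLS verification for this registry when pulling images to scan
  trivy.insecureRegistry.0: artifactory.deshaw.com
  # skip TLS verification when pulling the vulnerability DB
  trivy.dbRepositoryInsecure: "true"
```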
@chen-keinan Thanks for the suggestion. I made the above two edits, and the TLS cert issue is gone; now we are running into an 'Authentication is required' issue:
kubectl logs scan-vulnerabilityreport-64f9df68d-rwgnh -n trivy-system
2024-04-16T14:21:27.853Z FATAL image scan error: scan error: unable to initialize a scanner: unable to initialize a remote image scanner: 4 errors occurred:
* docker error: unable to inspect the image (artifactory.deshaw.com/k8s-qa/origprov-k8s:v2): Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
* containerd error: containerd socket not found: /run/containerd/containerd.sock
* podman error: unable to initialize Podman client: no podman socket found: stat podman/podman.sock: no such file or directory
* remote error: GET https://artifactory.deshaw.com/artifactory/api/docker/null/v2/token?scope=repository%3Ak8s-qa%2Forigprov-k8s%3Apull&service=artifactory.deshaw.com: : Authentication is required
@hore-deshaw are you using image pull secrets in the pods or in a service account? If so, how do you create the secrets? They should be in the form of user/password, like this:
kubectl create secret docker-registry regcred --docker-server=<your-registry-server> --docker-username=<your-name> --docker-password=<your-pword> --docker-email=<your-email>
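That command stores a kubernetes.io/dockerconfigjson payload. As a sketch with placeholder values (myuser, mypword and artifactory.example.com are not real), this reconstructs the JSON it generates, which is handy for comparing against what is actually stored in the cluster:

```shell
# Reconstruct the .dockerconfigjson payload that
# `kubectl create secret docker-registry` generates (placeholder values).
user="myuser"; pass="mypword"; registry="artifactory.example.com"
auth=$(printf '%s:%s' "$user" "$pass" | base64)
printf '{"auths":{"%s":{"username":"%s","password":"%s","auth":"%s"}}}\n' \
  "$registry" "$user" "$pass" "$auth" > dockerconfig.json
cat dockerconfig.json
# To inspect the real secret in-cluster (needs a live cluster):
#   kubectl get secret regcred -o jsonpath='{.data.\.dockerconfigjson}' | base64 -d
```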
@chen-keinan We are using image pull secrets in pods (and had also tried with service accounts). We already create the secret with the command you noted above.
@hore-deshaw can you please post an example of a pod/deployment descriptor with image pull secrets?
kubectl get pods <pod name> -o yaml
Could you also create a secret with fake credentials but the same registry (the same way you created the real one) and post it here? I want to see what it looks like.
The pod descriptor:
apiVersion: v1
kind: Pod
metadata:
annotations:
cni.projectcalico.org/containerID: 681c9e3a031da616ab3e6f407a3b2e4563fe398815f29770d7f3d5d6f0ae14ac
cni.projectcalico.org/podIP: 192.168.239.147/32
cni.projectcalico.org/podIPs: 192.168.239.147/32
creationTimestamp: "2024-04-16T13:42:36Z"
generateName: trivy-operator-668c66fb6-
labels:
app.kubernetes.io/instance: trivy-operator
app.kubernetes.io/name: trivy-operator
pod-template-hash: 668c66fb6
name: trivy-operator-668c66fb6-f9xxq
namespace: trivy-system
ownerReferences:
- apiVersion: apps/v1
blockOwnerDeletion: true
controller: true
kind: ReplicaSet
name: trivy-operator-668c66fb6
uid: 92b44468-7c74-4677-a60c-c0bb51482cd4
resourceVersion: "638276169"
uid: 5129a832-33cd-4a23-8ef0-bf7e0e2ee868
spec:
automountServiceAccountToken: true
containers:
- env:
- name: OPERATOR_NAMESPACE
value: trivy-system
- name: OPERATOR_TARGET_NAMESPACES
- name: OPERATOR_EXCLUDE_NAMESPACES
- name: OPERATOR_TARGET_WORKLOADS
value: pod,replicaset,replicationcontroller,statefulset,daemonset,cronjob,job
- name: OPERATOR_SERVICE_ACCOUNT
value: trivy-operator
envFrom:
- configMapRef:
name: trivy-operator-config
image: artifactory.deshaw.com/k8s/aquasecurity/trivy-operator:0.19.4
imagePullPolicy: IfNotPresent
livenessProbe:
failureThreshold: 10
httpGet:
path: /healthz/
port: probes
scheme: HTTP
initialDelaySeconds: 5
periodSeconds: 10
successThreshold: 1
timeoutSeconds: 1
name: trivy-operator
ports:
- containerPort: 8080
name: metrics
protocol: TCP
- containerPort: 9090
name: probes
protocol: TCP
readinessProbe:
failureThreshold: 3
httpGet:
path: /readyz/
port: probes
scheme: HTTP
initialDelaySeconds: 5
periodSeconds: 10
successThreshold: 1
timeoutSeconds: 1
resources: {}
securityContext:
allowPrivilegeEscalation: false
capabilities:
drop:
- ALL
privileged: false
readOnlyRootFilesystem: true
terminationMessagePath: /dev/termination-log
terminationMessagePolicy: File
volumeMounts:
- mountPath: /tmp
name: cache-policies
- mountPath: /var/run/secrets/kubernetes.io/serviceaccount
name: kube-api-access-5thmj
readOnly: true
dnsPolicy: ClusterFirst
enableServiceLinks: true
nodeName: hdc1webwrkqa4.k8s.des.co
preemptionPolicy: PreemptLowerPriority
priority: 0
restartPolicy: Always
schedulerName: default-scheduler
securityContext: {}
serviceAccount: trivy-operator
serviceAccountName: trivy-operator
terminationGracePeriodSeconds: 30
tolerations:
- effect: NoExecute
key: node.kubernetes.io/not-ready
operator: Exists
tolerationSeconds: 300
- effect: NoExecute
key: node.kubernetes.io/unreachable
operator: Exists
tolerationSeconds: 300
volumes:
- emptyDir: {}
name: cache-policies
- name: kube-api-access-5thmj
projected:
defaultMode: 420
sources:
- serviceAccountToken:
expirationSeconds: 3607
path: token
- configMap:
items:
- key: ca.crt
path: ca.crt
name: kube-root-ca.crt
- downwardAPI:
items:
- fieldRef:
apiVersion: v1
fieldPath: metadata.namespace
path: namespace
status:
conditions:
- lastProbeTime: null
lastTransitionTime: "2024-04-16T13:42:36Z"
status: "True"
type: Initialized
- lastProbeTime: null
lastTransitionTime: "2024-04-16T13:43:27Z"
status: "True"
type: Ready
- lastProbeTime: null
lastTransitionTime: "2024-04-16T13:43:27Z"
status: "True"
type: ContainersReady
- lastProbeTime: null
lastTransitionTime: "2024-04-16T13:42:36Z"
status: "True"
type: PodScheduled
containerStatuses:
- containerID: cri-o://c2700613cff7a185ffe994f4b96f23a117cf8429fc36d0231d4d3a8b389ef14b
image: artifactory.deshaw.com/k8s/aquasecurity/trivy-operator:0.19.4
imageID: artifactory.deshaw.com/k8s/aquasecurity/trivy-operator@sha256:305ef05858765ecd0ba1a6ad7d2519c878bb0b94152b1fcf8470b2b6df896d46
lastState: {}
name: trivy-operator
ready: true
restartCount: 0
started: true
state:
running:
startedAt: "2024-04-16T13:43:18Z"
hostIP: 10.240.183.134
phase: Running
podIP: 192.168.239.147
podIPs:
- ip: 192.168.239.147
qosClass: BestEffort
startTime: "2024-04-16T13:42:36Z"
The imagePullSecret has been defined in trivy-operator-trivy-config
The secret looks like:
apiVersion: v1
data:
.dockerconfigjson: <secret>
kind: Secret
metadata:
annotations:
kubectl.kubernetes.io/last-applied-configuration: |
{"apiVersion":"v1","data":{".dockerconfigjson":"<secret>"},"kind":"Secret","metadata":{"annotations":{},"name":"artifactory-secret","namespace":"trivy-system"},"type":"kubernetes.io/dockerconfigjson"}
creationTimestamp: "2024-02-26T18:11:43Z"
name: artifactory-secret
namespace: trivy-system
resourceVersion: "553969501"
uid: 7a54369e-326e-4283-90c5-88d42c428909
type: kubernetes.io/dockerconfigjson
@chheda-deshaw I see here two things which I would like to challenge:
- the secret name in the global config is OPERATOR_PRIVATE_REGISTRY_SCAN_SECRETS_NAMES: '{"trivy-system":"artifactory-login-cfg"}', whereas the secret you created is named artifactory-secret
- I wanted to see the descriptor of the pod running this image (artifactory.deshaw.com/k8s-qa/cdnqa:59bcf02), not the trivy-operator pod
Hi @chen-keinan, Apologies for the late reply.
the secret name in global config is: OPERATOR_PRIVATE_REGISTRY_SCAN_SECRETS_NAMES: '{"trivy-system":"artifactory-login-cfg"}' where the secret you created is named : artifactory-secret
artifactory-secret is the dummy secret I created for you. artifactory-login-cfg is what is actually being used (it has the same structure as artifactory-secret).
I wanted to see the pod of this image(artifactory.deshaw.com/k8s-qa/cdnqa:59bcf02) descriptor not trivy operator
This is the image that Trivy is scanning; it's not a pod itself but one of the images used in a pod. I can't provide that descriptor because it's proprietary information.
@chheda-deshaw I was trying to understand which method of private registry authentication you are using in your cluster, to see whether something is misconfigured. Could you share that info?
We are using Artifactory as a private registry. Here is an example pod (scoreboard) on the same cluster; I guess this info should be fine, right?
....<Redacted>....
Containers:
scoreboard:
Container ID: cri-o://5d0bbedd0a28fb11ce511ce2b7db835ba02c040173023c1980385e8fe4c0bacd
Image: artifactory.deshaw.com/k8s/scoreboard
Image ID: artifactory.deshaw.com/k8s/scoreboard@sha256:8653a0b99555c244038946af08ffb14e7139e2253931ce15df87359d3276ce3f
Port: 8000/TCP
Host Port: 0/TCP
SeccompProfile: RuntimeDefault
State: Running
Started: Sat, 13 Apr 2024 22:07:20 -0400
Ready: True
Restart Count: 0
Limits:
cpu: 4
ephemeral-storage: 128M
memory: 512M
Requests:
cpu: 25m
ephemeral-storage: 128M
memory: 512M
Environment: <none>
Mounts:
/var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-j92q8 (ro)
Conditions:
Type Status
Initialized True
Ready True
ContainersReady True
PodScheduled True
Volumes:
kube-api-access-j92q8:
Type: Projected (a volume that contains injected data from multiple sources)
TokenExpirationSeconds: 3607
ConfigMapName: kube-root-ca.crt
ConfigMapOptional: <nil>
DownwardAPI: true
QoS Class: Burstable
Node-Selectors: topology.kubernetes.io/region=nyc
Tolerations: node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events: <none>
Is the pod with the artifactory.deshaw.com/k8s/scoreboard image configured with an imagePullSecret?
Yes. Thanks for this. I checked the imagePullSecret of the scoreboard pod and found a couple of labels that were missing in the Trivy secret. I added:
labels:
app.kubernetes.io/managed-by: k8s-selfserve.deshaw.com
k8s-selfserve.deshaw.com/namespace: trivy-system
to the imagePullSecret.
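The patched secret now looks like this (a sketch assembled from the pieces above; <redacted> stays redacted):

```yaml
# Hedged sketch: the trivy pull secret with the two labels added.
apiVersion: v1
kind: Secret
metadata:
  name: artifactory-login-cfg
  namespace: trivy-system
  labels:
    app.kubernetes.io/managed-by: k8s-selfserve.deshaw.com
    k8s-selfserve.deshaw.com/namespace: trivy-system
type: kubernetes.io/dockerconfigjson
data:
  .dockerconfigjson: <redacted>
```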
Now the vulnerability scans have begun, but sometimes they still error out with:
2024-04-24T07:11:58.356Z FATAL image scan error: scan error: unable to initialize a scanner: unable to initialize a remote image scanner: 4 errors occurred:
* docker error: unable to inspect the image (artifactory.deshaw.com/k8s/whoami:prod): Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
* containerd error: containerd socket not found: /run/containerd/containerd.sock
* podman error: unable to initialize Podman client: no podman socket found: stat podman/podman.sock: no such file or directory
* remote error: GET https://artifactory.deshaw.com/artifactory/api/docker/null/v2/token?scope=repository%3Ak8s%2Fwhoami%3Apull&service=artifactory.deshaw.com: : Authentication is required
But they sometimes go through; for example, this is the vuln report from the namespace where trivy-operator was installed:
apiVersion: aquasecurity.github.io/v1alpha1
kind: VulnerabilityReport
metadata:
annotations:
trivy-operator.aquasecurity.github.io/report-ttl: 24h0m0s
creationTimestamp: "2024-04-24T09:27:43Z"
generation: 1
labels:
resource-spec-hash: 688f65596b
trivy-operator.container.name: trivy-operator
trivy-operator.resource.kind: ReplicaSet
trivy-operator.resource.name: trivy-operator-f9cb78c59
trivy-operator.resource.namespace: trivy-system
name: replicaset-trivy-operator-f9cb78c59-trivy-operator
namespace: trivy-system
ownerReferences:
- apiVersion: apps/v1
blockOwnerDeletion: false
controller: true
kind: ReplicaSet
name: trivy-operator-f9cb78c59
uid: 1b3bbc0f-c94e-4d8d-9d36-c86f15e387d9
resourceVersion: "651796189"
uid: 0a4205a2-8f36-441f-8156-c7315eb4ce70
report:
artifact:
digest: sha256:b90be0a9343d81d5fb3a13c736d903e62e9ff030d58adc6ec12d4a7c9cfb49d1
repository: k8s/aquasecurity/trivy-operator
tag: 0.19.4
os:
family: alpine
name: 3.19.1
registry:
server: artifactory.deshaw.com
scanner:
name: Trivy
vendor: Aqua Security
version: 0.50.1
summary:
criticalCount: 1
highCount: 0
lowCount: 2
mediumCount: 2
noneCount: 0
unknownCount: 0
updateTimestamp: "2024-04-24T09:27:43Z"
vulnerabilities:
- fixedVersion: 3.1.4-r6
installedVersion: 3.1.4-r5
lastModifiedDate: "2024-04-08T18:48:40Z"
links: []
primaryLink: https://avd.aquasec.com/nvd/cve-2024-2511
publishedDate: "2024-04-08T14:15:07Z"
resource: libcrypto3
score: 3.7
severity: LOW
target: ""
title: 'openssl: Unbounded memory growth with session handling in TLSv1.3'
vulnerabilityID: CVE-2024-2511
- fixedVersion: 3.1.4-r6
installedVersion: 3.1.4-r5
lastModifiedDate: "2024-04-08T18:48:40Z"
links: []
primaryLink: https://avd.aquasec.com/nvd/cve-2024-2511
publishedDate: "2024-04-08T14:15:07Z"
resource: libssl3
score: 3.7
severity: LOW
target: ""
title: 'openssl: Unbounded memory growth with session handling in TLSv1.3'
vulnerabilityID: CVE-2024-2511
- fixedVersion: 1.7.4
installedVersion: v1.7.3
lastModifiedDate: "2024-04-18T13:04:28Z"
links: []
primaryLink: https://avd.aquasec.com/nvd/cve-2024-3817
publishedDate: "2024-04-17T20:15:08Z"
resource: github.com/hashicorp/go-getter
score: 9.8
severity: CRITICAL
target: ""
title: HashiCorp's go-getter library is vulnerable to argument injection
...
vulnerabilityID: CVE-2024-3817
- fixedVersion: 0.23.0
installedVersion: v0.22.0
lastModifiedDate: "2024-04-19T07:15:08Z"
links: []
primaryLink: https://avd.aquasec.com/nvd/cve-2023-45288
publishedDate: "2024-04-04T21:15:16Z"
resource: golang.org/x/net
score: 7.5
severity: MEDIUM
target: ""
title: 'golang: net/http, x/net/http2: unlimited number of CONTINUATION frames
causes DoS'
vulnerabilityID: CVE-2023-45288
- fixedVersion: ""
installedVersion: v3.14.2
lastModifiedDate: "2024-04-11T01:05:27Z"
links: []
primaryLink: https://avd.aquasec.com/nvd/cve-2019-25210
publishedDate: "2024-03-03T21:15:49Z"
resource: helm.sh/helm/v3
score: 6.5
severity: MEDIUM
target: ""
title: 'helm: shows secrets with --dry-run option in clear text'
vulnerabilityID: CVE-2019-25210
So I'm not sure why it's still inconsistent.
@chheda-deshaw there are three main things that need to be checked:
- imagePullSecret is defined on the workload
- a secret with the same name exists in the resource namespace
- the secret is created in the form of user/password
imagePullSecret is defined on the workload
Yes it is
a secret with the same name exists in the resource namespace
It exists with the same name
the secret is created in the form of user/password
It is not in a plain username/password format, but the current format works for pods to talk to Artifactory. For example, every namespace has this secret with the same name (artifactory-login-cfg) to connect to the private registry, and pods download images with it without issues. The format is:
apiVersion: v1
data:
.dockerconfigjson: <redacted>
kind: Secret
metadata:
creationTimestamp: "2023-08-01T06:50:54Z"
labels:
app.kubernetes.io/managed-by: k8s-selfserve.deshaw.com
k8s-selfserve.deshaw.com/namespace: chheda
prod.uidcheck.deshaw.com/enabled: "true"
name: artifactory-login-cfg
namespace: chheda
resourceVersion: "1010524728"
uid: 272d7896-015f-4f98-8d8c-9e184746ee61
type: kubernetes.io/dockerconfigjson
Trivy supports user/password. What is the format used in your secrets? Can you add an example of how you create it?
We use an image pull secret of type kubernetes.io/dockerconfigjson, which basically embeds the username and password in the secret.
Steps for creating it are here: https://kubernetes.io/docs/tasks/configure-pod-container/pull-image-private-registry/
What exactly do you mean by user/password format?
If it is created this way, then it's ok:
kubectl create secret docker-registry regcred --docker-server=<your-registry-server> --docker-username=<your-name> --docker-password=<your-pword> --docker-email=<your-email>
Yes that is how it is created.
@chheda-deshaw can you please summarize the current status: some images are scanned and some have authentication issues?
Yes precisely.
Here is the output of all the jobs run by the trivy operator over a few minutes, captured with watch -n 2 'kubectl logs -l app.kubernetes.io/managed-by=trivy-operator -n trivy-system | tee -a logs.txt':
2024-04-25T07:09:47.370Z FATAL image scan error: scan error: unable to initialize a scanner: unable to initialize a remote image scanner: 4 errors
* docker error: unable to inspect the image (artifactory.deshaw.com/k8s-qa/mycsidriver:v0.3): Cannot connect to the Docker daemon at unix:///var/run
* containerd error: containerd socket not found: /run/containerd/containerd.sock
* podman error: unable to initialize Podman client: no podman socket found: stat podman/podman.sock: no such file or directory
* remote error: GET https://artifactory.deshaw.com/artifactory/api/docker/null/v2/token?scope=repository%3Ak8s-qa%2Fmycsidriver%3Apull&service=artif
2024-04-25T07:09:49.608Z FATAL image scan error: scan error: unable to initialize a scanner: unable to initialize a remote image scanner: 4 errors
* docker error: unable to inspect the image (artifactory.deshaw.com/k8s/burai/nname:v2): Cannot connect to the Docker daemon at unix:///var/run/dock
* containerd error: containerd socket not found: /run/containerd/containerd.sock
* podman error: unable to initialize Podman client: no podman socket found: stat podman/podman.sock: no such file or directory
* remote error: GET https://artifactory.deshaw.com/artifactory/api/docker/null/v2/token?scope=repository%3Ak8s%2Fburai%2Fnname%3Apull&service=artifa
2024-04-25T07:09:49.608Z FATAL image scan error: scan error: unable to initialize a scanner: unable to initialize a remote image scanner: 4 errors
* docker error: unable to inspect the image (artifactory.deshaw.com/k8s/burai/nname:v2): Cannot connect to the Docker daemon at unix:///var/run/dock
* containerd error: containerd socket not found: /run/containerd/containerd.sock
* podman error: unable to initialize Podman client: no podman socket found: stat podman/podman.sock: no such file or directory
* remote error: GET https://artifactory.deshaw.com/artifactory/api/docker/null/v2/token?scope=repository%3Ak8s%2Fburai%2Fnname%3Apull&service=artifa
2024-04-25T07:09:51.690Z FATAL image scan error: scan error: unable to initialize a scanner: unable to initialize a remote image scanner: 4 errors
* docker error: unable to inspect the image (artifactory.deshaw.com/k8s-qa/origprov-k8s:v2): Cannot connect to the Docker daemon at unix:///var/run/
* containerd error: containerd socket not found: /run/containerd/containerd.sock
* podman error: unable to initialize Podman client: no podman socket found: stat podman/podman.sock: no such file or directory
* remote error: GET https://artifactory.deshaw.com/artifactory/api/docker/null/v2/token?scope=repository%3Ak8s-qa%2Forigprov-k8s%3Apull&service=arti
2024-04-25T07:09:53.784Z FATAL image scan error: scan error: unable to initialize a scanner: unable to initialize a remote image scanner: 4 errors
* docker error: unable to inspect the image (artifactory.deshaw.com/k8s-qa/trymount:v4): Cannot connect to the Docker daemon at unix:///var/run/dock
* containerd error: containerd socket not found: /run/containerd/containerd.sock
* podman error: unable to initialize Podman client: no podman socket found: stat podman/podman.sock: no such file or directory
* remote error: GET https://artifactory.deshaw.com/artifactory/api/docker/null/v2/token?scope=repository%3Ak8s-qa%2Ftrymount%3Apull&service=artifact
MWoTO/n99mAImYKOdvvOfLmK47PP7U6oOEQNUArdtHdXDhvHBu5lQOuGI2o96jgCg3URiojVRgHo
RFyiCj6wqAZFQTC+5Gx0fb4DjvtejcxyZQkoKWpmmomkQEi5mopkF6GaKKnBZc0fq1dPTxyA59nJ
vRVacAoSCDhAJVWlEJSYB9K4sUR6M3oNmzh3aSa0UYdFiQCnGDv2PaClRyEP0S/X0WP3KCGEyf1P
D9cQDadIBzPgJuRObeUmvq+xEHnfZRR84Hs9TID8ZexEJODAEV7AKE1G0hQXByB7uVQQ79aghcud
jX9hRtiiEkkN1XOsQiGz7dKNQKnzBUy0mR8iJaK7RJC4gdelk5ooHCQhCInoIIdVeJO1YB08Yh7W
dc3JjjWUDVacvEgnSZaaFhH5ntFhywZEremJRjpwWnNo+9dCQXBS2odAVjp7TnerWjFF/whkZ4Ng
VhklTGohJmPLXP4XBsmA4oC6xRHIL54XlstaNIq4JsGKgWFkI883pKK+ZQ7CARgQJRHsFwiXY0BM
88FEflEEcFEaNtbOPQHotJ7YB4O0eQAV6g2BuVeLxwNa5g6vB0UM/ydQIEsCseqwMnLEXnFugrYO
WwCjAyEAcCniJ1nbyCWmhgRASJBV/4A/1AFeAgi/+UTKftH/qojwe6A6FBbqqiXBFuro4pJ+Ah//
F3JFOFCQQ3Di9Q==
NbvddLFGltKa9a4kvvdaQ0OC37X83X0wkYyCSfT+H1qqqqqqqqqrGMjIiKqoiqggxion69rGInba
qqqqiLossRFVEQQVVX1fLmUoUEREXi1WMWmZiqv82+BdnHCquBbZp2ymjQxlMhmKtlVXMlrV8nLa
EhAeBpGGmIam8wAchYC+cqBzBHQKClVsYXCNhYnkEwj4QouWRdfgxwiRhPTQ1tTSRBrMSZBdOleR
xIP+ka98e+oGtskWLSyWqbmuUUU87O2SRotYib39P+P7nkx+dJHz7HAjt3nDOZPVd/jCmDCSZYr+
Sh60PJ4r3lPqP44jtB831W2H0EMOg1Mi5cg3fKMVhEPfDqPCQTTIJ/w9fbzVmr91ut/LW8rj6KMm
5nJCxuEuRHiRDWK5kOZ6jt6zHeHpSa2EiwzdrvbscT4s5Xl+dP89T0eprYPCpVSqkrtWNBOogfHg
r4q8IW8lGhxojjephG0l/MWTXsjHjh9tKc0TgMknxhqyROTWBMEOCVJLnNgXl44GCyNWK8u6KiGe
SyrWEbVEMVFKWWalyykF5FF68oNibPicTJXMhlir5ThSt1eZR1hzO6IOQ0Z0wSNVwgSrb61lu4mt
YvRAgH4OLsT+pof0dQc0xxVE0M1RrNdiPnD839rreo7ZLD4B9XuCAdpoE9/skD9kRYiqqoq+8+D+
sD8QRYCqH/sP7WD+3/0AB/JXIyxxQJMcSRGIMtn/+LuSKcKEgxzn70A=
wYJH1eJGeHiZGEu65FqFV5YWXTJClKVFuQGIQuXCF1bMbHMpWxYw0MCUVBubaJeYTNUZphdgxr+s
soZKMpdfJsXtMjZkwoVJq+tCy8Uja7nsvguefO95sxjx2ctMs58c+ZRtQI7LRgZDjIggsHIZl1Lk
gwkYo9MzMz5pmnw132tV6Mvd8PC8xBoOJMFIqhmAkebBDQaRNlmQsosscd2UJwc2TBIrwIsYHKUl
DCkYjEOMSEECguGGCU2hiI5LjCYQN7nGDcW+aONqNN8t2nLgYWRFP0oIREUsPEJyAxYobEIUXHR5
EXk6FBl7DGPtMuaXpY6yaZPZ9ns9yshgXQMlA7MF1dGWWWVsQ5UWLkBg2aFpGz2WAmRTdSVLS0EO
yqK+5m8cVT5PrOOZ7Ofa8cwgykhKG2TS7NRaVyTqPxUvqJrnaRZZsxlksRI+zOXXIkdUsRI7NylK
dFOy62mZo4TmbPY9vgVF2g1dEfpLoTQ1zREBqo8DzaIf0l7Y2MmdhWKKFlcapgcIQuO6OCiN66pJ
+UwTYuf9GVNxtJHJP71ST4pTIaNoQ0Dm5UOsLpelJ0KJzlRCTVyh++jD+MhiBRQkiXq0yMc/c4pM
ir0SHqfsPFPE+hayT0E2iSUCRHrLk3SIPBMiVH6vnH/1f+NAe5VUY/76h/+5Kev+/PQBYRISwkJi
g/+H/wweH/yLuSKcKEhTp6WBgA==
ZNYKhLhWSU0oimgQKgnrJZ4KDXywgEPg6pzCAEjgVALwcqBQ+wDvtIsTEKhIKDikzhJUF4hdiMj3
TY7fmO587Ozsm1QoPxiQLcdJq8g4OnWEwGUC66XPUQ6nc/fqKhkih9v7/MkkVVWMRFVUEFVVVVW0
oqqq+y3ylqoione3HNusVaYv7hrUmV8F717v0vuiprANOOnnhebYZmnvhUPyKNwBDwChEFCqgR8i
iZwQEorKKTXlCZ8bl1H3auc4aYyAyLIrGCsbAIsTVQzG1BM0VCiJeyUWLqLI9gQCKClCrFBaEAlJ
+EUT13688gBMgSSxWxtBByIFdKXvD9YqF4o4lfK54NzwB4eF48f6lQO/U5O7nGCPWDhEYliC/30D
SEJrylBYzB6II7rkEddBwNoK54mfAQqC6Afz24FogH1iAem3tuofvgSWVNTrZPHIoXKC/TEf7Lh/
vsxb2eRZh7mPuM9kN+pwFieqFBySDQb104saWRu0SXmyphFw1FoyHkFRVWXFFp+djNugJOV4IVei
WlCWFJhJIF3mZwwAo0C6rYAmLiI2CIoQECIKWVKET540yFgUPnIIhSKGnXmVEiNbeog8rVWZM3LI
Grm/ubwzYBKoKotF5SWQa/psOn43EpYc2fdHVqMjSoB5H+wH+FEHQURT/JEpT/AEf8UULDiQC6AD
dEAC4gl1dnJJPvEP/4u5IpwoSDDg0zaA
I assume the base64 output means those jobs succeeded.
Here are the trivy-operator logs:
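To see what those base64 blocks actually contain, they can be decoded locally; a minimal sketch, assuming one full block has been saved to `blob.txt` (the single line below is just a placeholder taken from the end of one block):

```shell
# Placeholder input: in practice, paste a full base64 block into blob.txt.
printf '%s\n' 'F3JFOFCQQ3Di9Q==' > blob.txt
# Decode and identify the payload; compressed scan-job output would show up
# as compressed data (e.g. bzip2) rather than plain text.
base64 -d blob.txt > blob.bin
file blob.bin
```

If `file` reports bzip2 data, `bzcat blob.bin` should print the underlying scan output.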
{"level":"error","ts":"2024-04-25T07:16:03Z","msg":"Reconciler error","controller":"replicaset","controllerGroup":"apps","controllerKind":"ReplicaSet","ReplicaSet":{"name":"app-lb-test-5c687c544d","namespace":"hore"},"namespace":"hore","name":"app-lb-test-5c687c544d","reconcileID":"1d90dd86-0600-4365-a52e-19abd0d9c151","error":"evaluating resource: no policies found for kind: ReplicaSet","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-25T07:16:03Z","msg":"Reconciler error","controller":"replicaset","controllerGroup":"apps","controllerKind":"ReplicaSet","ReplicaSet":{"name":"app-lb-test-5c687c544d","namespace":"hore"},"namespace":"hore","name":"app-lb-test-5c687c544d","reconcileID":"bdfc5c5c-3a3a-4545-a6c2-7d2d5524f43e","error":"evaluating resource: no policies found for kind: ReplicaSet","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-25T07:16:03Z","msg":"Reconciler error","controller":"replicaset","controllerGroup":"apps","controllerKind":"ReplicaSet","ReplicaSet":{"name":"trymount-797598f88d","namespace":"hore"},"namespace":"hore","name":"trymount-797598f88d","reconcileID":"6b563b0c-32be-4094-9eb8-07f33ed6b1aa","error":"evaluating resource: no policies found for kind: ReplicaSet","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-25T07:16:04Z","msg":"Reconciler error","controller":"replicaset","controllerGroup":"apps","controllerKind":"ReplicaSet","ReplicaSet":{"name":"trymount-797598f88d","namespace":"hore"},"namespace":"hore","name":"trymount-797598f88d","reconcileID":"f4ccdc76-7685-4d7d-a7fb-407b46651cba","error":"evaluating resource: no policies found for kind: ReplicaSet","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-25T07:16:04Z","msg":"Reconciler error","controller":"replicaset","controllerGroup":"apps","controllerKind":"ReplicaSet","ReplicaSet":{"name":"csi-driver-6dbcccdf6f","namespace":"bmcsi-k8s"},"namespace":"bmcsi-k8s","name":"csi-driver-6dbcccdf6f","reconcileID":"e5ea5ad2-c93f-46f1-aa86-e7d02fd215a6","error":"evaluating resource: no policies found for kind: ReplicaSet","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-25T07:16:04Z","msg":"Reconciler error","controller":"replicaset","controllerGroup":"apps","controllerKind":"ReplicaSet","ReplicaSet":{"name":"csi-driver-6dbcccdf6f","namespace":"bmcsi-k8s"},"namespace":"bmcsi-k8s","name":"csi-driver-6dbcccdf6f","reconcileID":"ade0961c-b347-41f9-9812-ac7666674019","error":"evaluating resource: no policies found for kind: ReplicaSet","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-25T07:16:05Z","msg":"Reconciler error","controller":"cronjob","controllerGroup":"batch","controllerKind":"CronJob","CronJob":{"name":"descheduler","namespace":"kube-system"},"namespace":"kube-system","name":"descheduler","reconcileID":"fa8e15b6-97aa-4420-a757-2ce3e066e2c9","error":"evaluating resource: no policies found for kind: CronJob","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-25T07:16:05Z","msg":"Reconciler error","controller":"cronjob","controllerGroup":"batch","controllerKind":"CronJob","CronJob":{"name":"descheduler","namespace":"kube-system"},"namespace":"kube-system","name":"descheduler","reconcileID":"0a931d7f-0164-4576-b06e-23b5d2fe30df","error":"evaluating resource: no policies found for kind: CronJob","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-25T07:16:07Z","msg":"Reconciler error","controller":"replicaset","controllerGroup":"apps","controllerKind":"ReplicaSet","ReplicaSet":{"name":"helloworlds-677855f6f","namespace":"bharasoh"},"namespace":"bharasoh","name":"helloworlds-677855f6f","reconcileID":"aecd2610-7f3d-4f6c-af99-cfc4b34b8e1e","error":"evaluating resource: no policies found for kind: ReplicaSet","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
{"level":"error","ts":"2024-04-25T07:16:08Z","msg":"Reconciler error","controller":"replicaset","controllerGroup":"apps","controllerKind":"ReplicaSet","ReplicaSet":{"name":"helloworlds-677855f6f","namespace":"bharasoh"},"namespace":"bharasoh","name":"helloworlds-677855f6f","reconcileID":"a49617bb-b921-4fbe-84a0-cdf19f7a336a","error":"evaluating resource: no policies found for kind: ReplicaSet","stacktrace":"sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:329\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:266\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2\n\t/home/runner/go/pkg/mod/sigs.k8s.io/[email protected]/pkg/internal/controller/controller.go:227"}
@chheda-deshaw do you see vulnerability reports?
kubectl get vulns --all-namespaces -o wide
Yes
@chheda-deshaw can you take a look at one of the auth-failing pods (containers) and see if it matches the three conditions I mentioned above?
It does. Every namespace created in the cluster has the secret, and it is in user:pass format.
I have been watching the operator for the past hour, and it keeps retrying scans of certain pods that fail again and again.
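As a sanity check on the secret format: the `auth` field inside the secret's `.dockerconfigjson` should be the base64 of `user:password`. A quick way to verify the encoding, with hypothetical credentials:

```shell
# Hypothetical credentials for illustration only.
auth=$(printf '%s' 'myuser:mypass' | base64)
# This value should match the .auths["<registry-host>"].auth entry
# in the secret's decoded .dockerconfigjson.
echo "$auth"
# Round-trip to confirm it decodes back to user:password form:
printf '%s' "$auth" | base64 -d
```

The decoded secret itself can be inspected with `kubectl get secret <name> -o jsonpath='{.data.\.dockerconfigjson}' | base64 -d`.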
6m56s Normal Created Pod/scan-vulnerabilityreport-59b7678f68-vhvxh Created container app-lb-test
6m54s Normal Pulled Pod/scan-vulnerabilityreport-848cdf7f7-dt9w8 Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
6m54s Normal Created Pod/scan-vulnerabilityreport-848cdf7f7-dt9w8 Created container csi-driver
6m54s Normal Started Pod/scan-vulnerabilityreport-848cdf7f7-dt9w8 Started container csi-driver
6m50s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-64f9df68d Job has reached the specified backoff limit
6m47s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-7bd778f4ff Job has reached the specified backoff limit
6m45s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-59b7678f68 Job has reached the specified backoff limit
6m43s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-848cdf7f7 Job has reached the specified backoff limit
6m24s Normal SuccessfulCreate Job/scan-vulnerabilityreport-64f9df68d Created pod: scan-vulnerabilityreport-64f9df68d-n692n
6m24s Normal Scheduled Pod/scan-vulnerabilityreport-64f9df68d-n692n Successfully assigned trivy-system/scan-vulnerabilityreport-64f
6m22s Normal Scheduled Pod/scan-vulnerabilityreport-7bd778f4ff-4jv8k Successfully assigned trivy-system/scan-vulnerabilityreport-7bd
6m22s Normal SuccessfulCreate Job/scan-vulnerabilityreport-7bd778f4ff Created pod: scan-vulnerabilityreport-7bd778f4ff-4jv8k
6m21s Normal SuccessfulCreate Job/scan-vulnerabilityreport-848cdf7f7 Created pod: scan-vulnerabilityreport-848cdf7f7-kfgdj
6m21s Normal Scheduled Pod/scan-vulnerabilityreport-848cdf7f7-kfgdj Successfully assigned trivy-system/scan-vulnerabilityreport-848
6m20s Normal Pulled Pod/scan-vulnerabilityreport-64f9df68d-n692n Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
6m20s Normal SuccessfulCreate Job/scan-vulnerabilityreport-59b7678f68 Created pod: scan-vulnerabilityreport-59b7678f68-ftdl2
6m20s Normal Scheduled Pod/scan-vulnerabilityreport-59b7678f68-ftdl2 Successfully assigned trivy-system/scan-vulnerabilityreport-59b
6m19s Normal Created Pod/scan-vulnerabilityreport-64f9df68d-n692n Created container csi-driver
6m19s Normal Started Pod/scan-vulnerabilityreport-64f9df68d-n692n Started container csi-driver
6m17s Normal Started Pod/scan-vulnerabilityreport-7bd778f4ff-4jv8k Started container trymount
6m17s Normal Created Pod/scan-vulnerabilityreport-7bd778f4ff-4jv8k Created container trymount
6m17s Normal Pulled Pod/scan-vulnerabilityreport-7bd778f4ff-4jv8k Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
6m15s Normal Pulled Pod/scan-vulnerabilityreport-848cdf7f7-kfgdj Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
6m15s Normal Started Pod/scan-vulnerabilityreport-848cdf7f7-kfgdj Started container csi-driver
6m15s Normal Created Pod/scan-vulnerabilityreport-848cdf7f7-kfgdj Created container csi-driver
6m13s Normal Started Pod/scan-vulnerabilityreport-59b7678f68-ftdl2 Started container app-lb-test
6m13s Normal Pulled Pod/scan-vulnerabilityreport-59b7678f68-ftdl2 Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
6m13s Normal Created Pod/scan-vulnerabilityreport-59b7678f68-ftdl2 Created container app-lb-test
6m9s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-64f9df68d Job has reached the specified backoff limit
6m6s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-7bd778f4ff Job has reached the specified backoff limit
6m4s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-59b7678f68 Job has reached the specified backoff limit
6m2s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-848cdf7f7 Job has reached the specified backoff limit
5m59s Normal SuccessfulCreate Job/scan-vulnerabilityreport-65767fcd48 Created pod: scan-vulnerabilityreport-65767fcd48-tcbcz
5m59s Normal Scheduled Pod/scan-vulnerabilityreport-65767fcd48-tcbcz Successfully assigned trivy-system/scan-vulnerabilityreport-657
5m55s Normal Created Pod/scan-vulnerabilityreport-65767fcd48-tcbcz Created container hello-world
5m55s Normal Started Pod/scan-vulnerabilityreport-65767fcd48-tcbcz Started container hello-world
5m55s Normal Pulled Pod/scan-vulnerabilityreport-65767fcd48-tcbcz Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
5m41s Normal SuccessfulCreate Job/scan-vulnerabilityreport-64f9df68d Created pod: scan-vulnerabilityreport-64f9df68d-w586v
5m41s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-65767fcd48 Job has reached the specified backoff limit
5m41s Normal Scheduled Pod/scan-vulnerabilityreport-64f9df68d-w586v Successfully assigned trivy-system/scan-vulnerabilityreport-64f
5m40s Normal SuccessfulCreate Job/scan-vulnerabilityreport-7bd778f4ff Created pod: scan-vulnerabilityreport-7bd778f4ff-bflhc
5m40s Normal Scheduled Pod/scan-vulnerabilityreport-7bd778f4ff-bflhc Successfully assigned trivy-system/scan-vulnerabilityreport-7bd
5m37s Normal SuccessfulCreate Job/scan-vulnerabilityreport-59b7678f68 Created pod: scan-vulnerabilityreport-59b7678f68-skzzh
5m37s Normal Pulled Pod/scan-vulnerabilityreport-64f9df68d-w586v Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
5m37s Normal Scheduled Pod/scan-vulnerabilityreport-59b7678f68-skzzh Successfully assigned trivy-system/scan-vulnerabilityreport-59b
5m36s Normal SuccessfulCreate Job/scan-vulnerabilityreport-848cdf7f7 Created pod: scan-vulnerabilityreport-848cdf7f7-6zh6p
5m36s Normal Created Pod/scan-vulnerabilityreport-64f9df68d-w586v Created container csi-driver
5m36s Normal Started Pod/scan-vulnerabilityreport-64f9df68d-w586v Started container csi-driver
5m36s Normal Scheduled Pod/scan-vulnerabilityreport-848cdf7f7-6zh6p Successfully assigned trivy-system/scan-vulnerabilityreport-848
5m34s Normal Started Pod/scan-vulnerabilityreport-7bd778f4ff-bflhc Started container trymount
5m34s Normal Created Pod/scan-vulnerabilityreport-7bd778f4ff-bflhc Created container trymount
5m34s Normal Pulled Pod/scan-vulnerabilityreport-7bd778f4ff-bflhc Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
5m34s Normal SuccessfulCreate Job/scan-vulnerabilityreport-dc6b86f4c Created pod: scan-vulnerabilityreport-dc6b86f4c-4p65b
5m34s Normal Scheduled Pod/scan-vulnerabilityreport-dc6b86f4c-4p65b Successfully assigned trivy-system/scan-vulnerabilityreport-dc6
5m32s Normal Started Pod/scan-vulnerabilityreport-59b7678f68-skzzh Started container app-lb-test
5m32s Normal Created Pod/scan-vulnerabilityreport-59b7678f68-skzzh Created container app-lb-test
5m32s Normal Pulled Pod/scan-vulnerabilityreport-59b7678f68-skzzh Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
5m30s Normal Started Pod/scan-vulnerabilityreport-848cdf7f7-6zh6p Started container csi-driver
5m30s Normal Pulled Pod/scan-vulnerabilityreport-848cdf7f7-6zh6p Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
5m30s Normal Created Pod/scan-vulnerabilityreport-848cdf7f7-6zh6p Created container csi-driver
5m28s Normal SuccessfulCreate Job/scan-vulnerabilityreport-789fb68f47 Created pod: scan-vulnerabilityreport-789fb68f47-hddfd
5m28s Normal Pulled Pod/scan-vulnerabilityreport-dc6b86f4c-4p65b Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
5m28s Normal Created Pod/scan-vulnerabilityreport-dc6b86f4c-4p65b Created container helloworlds
5m28s Normal Started Pod/scan-vulnerabilityreport-dc6b86f4c-4p65b Started container helloworlds
5m28s Normal Scheduled Pod/scan-vulnerabilityreport-789fb68f47-hddfd Successfully assigned trivy-system/scan-vulnerabilityreport-789
5m23s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-7bd778f4ff Job has reached the specified backoff limit
5m20s Normal Started Pod/scan-vulnerabilityreport-789fb68f47-hddfd Started container windows-test
5m20s Normal Pulled Pod/scan-vulnerabilityreport-789fb68f47-hddfd Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
5m20s Normal Created Pod/scan-vulnerabilityreport-789fb68f47-hddfd Created container windows-test
5m13s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-848cdf7f7 Job has reached the specified backoff limit
5m11s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-59b7678f68 Job has reached the specified backoff limit
5m8s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-dc6b86f4c Job has reached the specified backoff limit
5m6s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-789fb68f47 Job has reached the specified backoff limit
5m4s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-64f9df68d Job has reached the specified backoff limit
5m1s Normal SuccessfulCreate Job/scan-vulnerabilityreport-dc6b86f4c Created pod: scan-vulnerabilityreport-dc6b86f4c-mmn6v
5m1s Normal Scheduled Pod/scan-vulnerabilityreport-dc6b86f4c-mmn6v Successfully assigned trivy-system/scan-vulnerabilityreport-dc6
4m59s Normal SuccessfulCreate Job/scan-vulnerabilityreport-59b7678f68 Created pod: scan-vulnerabilityreport-59b7678f68-v8lc9
4m59s Normal Scheduled Pod/scan-vulnerabilityreport-59b7678f68-v8lc9 Successfully assigned trivy-system/scan-vulnerabilityreport-59b
4m57s Normal SuccessfulCreate Job/scan-vulnerabilityreport-7bd778f4ff Created pod: scan-vulnerabilityreport-7bd778f4ff-h82wn
4m57s Normal Scheduled Pod/scan-vulnerabilityreport-7bd778f4ff-h82wn Successfully assigned trivy-system/scan-vulnerabilityreport-7bd
4m56s Normal Started Pod/scan-vulnerabilityreport-dc6b86f4c-mmn6v Started container helloworlds
4m56s Normal SuccessfulCreate Job/scan-vulnerabilityreport-848cdf7f7 Created pod: scan-vulnerabilityreport-848cdf7f7-9zdvq
4m56s Normal Created Pod/scan-vulnerabilityreport-dc6b86f4c-mmn6v Created container helloworlds
4m56s Normal Pulled Pod/scan-vulnerabilityreport-dc6b86f4c-mmn6v Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
4m56s Normal Scheduled Pod/scan-vulnerabilityreport-848cdf7f7-9zdvq Successfully assigned trivy-system/scan-vulnerabilityreport-848
4m55s Normal SuccessfulCreate Job/scan-vulnerabilityreport-64f9df68d Created pod: scan-vulnerabilityreport-64f9df68d-f4mj6
4m54s Normal Scheduled Pod/scan-vulnerabilityreport-64f9df68d-f4mj6 Successfully assigned trivy-system/scan-vulnerabilityreport-64f
4m54s Normal Started Pod/scan-vulnerabilityreport-59b7678f68-v8lc9 Started container app-lb-test
4m54s Normal Created Pod/scan-vulnerabilityreport-59b7678f68-v8lc9 Created container app-lb-test
4m54s Normal Pulled Pod/scan-vulnerabilityreport-59b7678f68-v8lc9 Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
4m52s Normal Started Pod/scan-vulnerabilityreport-7bd778f4ff-h82wn Started container trymount
4m52s Normal Created Pod/scan-vulnerabilityreport-7bd778f4ff-h82wn Created container trymount
4m52s Normal Pulled Pod/scan-vulnerabilityreport-7bd778f4ff-h82wn Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
4m50s Normal Pulled Pod/scan-vulnerabilityreport-848cdf7f7-9zdvq Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
4m50s Normal Started Pod/scan-vulnerabilityreport-848cdf7f7-9zdvq Started container csi-driver
4m50s Normal Created Pod/scan-vulnerabilityreport-848cdf7f7-9zdvq Created container csi-driver
4m48s Normal Started Pod/scan-vulnerabilityreport-64f9df68d-f4mj6 Started container csi-driver
4m48s Normal Created Pod/scan-vulnerabilityreport-64f9df68d-f4mj6 Created container csi-driver
4m48s Normal Pulled Pod/scan-vulnerabilityreport-64f9df68d-f4mj6 Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
4m44s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-dc6b86f4c Job has reached the specified backoff limit
4m42s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-848cdf7f7 Job has reached the specified backoff limit
4m39s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-7bd778f4ff Job has reached the specified backoff limit
4m37s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-59b7678f68 Job has reached the specified backoff limit
4m34s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-64f9df68d Job has reached the specified backoff limit
4m29s Normal SuccessfulCreate Job/scan-vulnerabilityreport-dc6b86f4c Created pod: scan-vulnerabilityreport-dc6b86f4c-q2qbt
4m29s Normal Scheduled Pod/scan-vulnerabilityreport-dc6b86f4c-q2qbt Successfully assigned trivy-system/scan-vulnerabilityreport-dc6
4m25s Normal Created Pod/scan-vulnerabilityreport-dc6b86f4c-q2qbt Created container helloworlds
4m25s Normal Started Pod/scan-vulnerabilityreport-dc6b86f4c-q2qbt Started container helloworlds
4m25s Normal Pulled Pod/scan-vulnerabilityreport-dc6b86f4c-q2qbt Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
4m16s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-dc6b86f4c Job has reached the specified backoff limit
4m16s Normal SuccessfulCreate Job/scan-vulnerabilityreport-64f9df68d Created pod: scan-vulnerabilityreport-64f9df68d-5759p
4m16s Normal Scheduled Pod/scan-vulnerabilityreport-64f9df68d-5759p Successfully assigned trivy-system/scan-vulnerabilityreport-64f
4m15s Normal SuccessfulCreate Job/scan-vulnerabilityreport-59b7678f68 Created pod: scan-vulnerabilityreport-59b7678f68-6d5p7
4m15s Normal Scheduled Pod/scan-vulnerabilityreport-59b7678f68-6d5p7 Successfully assigned trivy-system/scan-vulnerabilityreport-59b
4m13s Normal SuccessfulCreate Job/scan-vulnerabilityreport-7bd778f4ff Created pod: scan-vulnerabilityreport-7bd778f4ff-26xrj
4m13s Normal Scheduled Pod/scan-vulnerabilityreport-7bd778f4ff-26xrj Successfully assigned trivy-system/scan-vulnerabilityreport-7bd
4m12s Normal Created Pod/scan-vulnerabilityreport-64f9df68d-5759p Created container csi-driver
4m12s Normal SuccessfulCreate Job/scan-vulnerabilityreport-848cdf7f7 Created pod: scan-vulnerabilityreport-848cdf7f7-vnwqh
4m12s Normal Pulled Pod/scan-vulnerabilityreport-64f9df68d-5759p Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
4m12s Normal Started Pod/scan-vulnerabilityreport-64f9df68d-5759p Started container csi-driver
4m12s Normal Scheduled Pod/scan-vulnerabilityreport-848cdf7f7-vnwqh Successfully assigned trivy-system/scan-vulnerabilityreport-848
4m10s Normal Created Pod/scan-vulnerabilityreport-59b7678f68-6d5p7 Created container app-lb-test
4m10s Normal Pulled Pod/scan-vulnerabilityreport-59b7678f68-6d5p7 Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
4m9s Normal Started Pod/scan-vulnerabilityreport-59b7678f68-6d5p7 Started container app-lb-test
4m7s Normal Created Pod/scan-vulnerabilityreport-7bd778f4ff-26xrj Created container trymount
4m7s Normal Pulled Pod/scan-vulnerabilityreport-7bd778f4ff-26xrj Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
4m7s Normal Started Pod/scan-vulnerabilityreport-7bd778f4ff-26xrj Started container trymount
4m5s Normal Created Pod/scan-vulnerabilityreport-848cdf7f7-vnwqh Created container csi-driver
4m5s Normal Pulled Pod/scan-vulnerabilityreport-848cdf7f7-vnwqh Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
4m5s Normal Started Pod/scan-vulnerabilityreport-848cdf7f7-vnwqh Started container csi-driver
4m Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-64f9df68d Job has reached the specified backoff limit
3m58s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-59b7678f68 Job has reached the specified backoff limit
3m56s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-7bd778f4ff Job has reached the specified backoff limit
3m53s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-848cdf7f7 Job has reached the specified backoff limit
3m34s Normal SuccessfulCreate Job/scan-vulnerabilityreport-59b7678f68 Created pod: scan-vulnerabilityreport-59b7678f68-8fqgj
3m33s Normal Scheduled Pod/scan-vulnerabilityreport-59b7678f68-8fqgj Successfully assigned trivy-system/scan-vulnerabilityreport-59b
3m32s Normal SuccessfulCreate Job/scan-vulnerabilityreport-dc6b86f4c Created pod: scan-vulnerabilityreport-dc6b86f4c-4qqbn
3m32s Normal Scheduled Pod/scan-vulnerabilityreport-dc6b86f4c-4qqbn Successfully assigned trivy-system/scan-vulnerabilityreport-dc6
3m31s Normal SuccessfulCreate Job/scan-vulnerabilityreport-64f9df68d Created pod: scan-vulnerabilityreport-64f9df68d-h4pg7
3m31s Normal Scheduled Pod/scan-vulnerabilityreport-64f9df68d-h4pg7 Successfully assigned trivy-system/scan-vulnerabilityreport-64f
3m29s Normal Started Pod/scan-vulnerabilityreport-59b7678f68-8fqgj Started container app-lb-test
3m29s Normal Created Pod/scan-vulnerabilityreport-59b7678f68-8fqgj Created container app-lb-test
3m29s Normal Pulled Pod/scan-vulnerabilityreport-59b7678f68-8fqgj Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
3m28s Normal SuccessfulCreate Job/scan-vulnerabilityreport-848cdf7f7 Created pod: scan-vulnerabilityreport-848cdf7f7-wtzbf
3m28s Normal Scheduled Pod/scan-vulnerabilityreport-848cdf7f7-wtzbf Successfully assigned trivy-system/scan-vulnerabilityreport-848
3m27s Normal Pulled Pod/scan-vulnerabilityreport-dc6b86f4c-4qqbn Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
3m27s Normal Started Pod/scan-vulnerabilityreport-dc6b86f4c-4qqbn Started container helloworlds
3m27s Normal Created Pod/scan-vulnerabilityreport-dc6b86f4c-4qqbn Created container helloworlds
3m26s Normal SuccessfulCreate Job/scan-vulnerabilityreport-7bd778f4ff Created pod: scan-vulnerabilityreport-7bd778f4ff-xfkww
3m26s Normal Scheduled Pod/scan-vulnerabilityreport-7bd778f4ff-xfkww Successfully assigned trivy-system/scan-vulnerabilityreport-7bd
3m25s Normal Started Pod/scan-vulnerabilityreport-64f9df68d-h4pg7 Started container csi-driver
3m25s Normal Created Pod/scan-vulnerabilityreport-64f9df68d-h4pg7 Created container csi-driver
3m25s Normal Pulled Pod/scan-vulnerabilityreport-64f9df68d-h4pg7 Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
3m23s Normal Started Pod/scan-vulnerabilityreport-848cdf7f7-wtzbf Started container csi-driver
3m23s Normal Created Pod/scan-vulnerabilityreport-848cdf7f7-wtzbf Created container csi-driver
3m23s Normal Pulled Pod/scan-vulnerabilityreport-848cdf7f7-wtzbf Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
3m18s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-59b7678f68 Job has reached the specified backoff limit
3m18s Normal Created Pod/scan-vulnerabilityreport-7bd778f4ff-xfkww Created container trymount
3m18s Normal Started Pod/scan-vulnerabilityreport-7bd778f4ff-xfkww Started container trymount
3m18s Normal Pulled Pod/scan-vulnerabilityreport-7bd778f4ff-xfkww Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
3m14s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-dc6b86f4c Job has reached the specified backoff limit
3m12s Normal SuccessfulCreate Job/scan-vulnerabilityreport-65767fcd48 Created pod: scan-vulnerabilityreport-65767fcd48-vmshq
3m11s Normal Scheduled Pod/scan-vulnerabilityreport-65767fcd48-vmshq Successfully assigned trivy-system/scan-vulnerabilityreport-657
3m11s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-848cdf7f7 Job has reached the specified backoff limit
3m9s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-64f9df68d Job has reached the specified backoff limit
3m7s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-7bd778f4ff Job has reached the specified backoff limit
3m6s Normal Pulled Pod/scan-vulnerabilityreport-65767fcd48-vmshq Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
3m6s Normal Started Pod/scan-vulnerabilityreport-65767fcd48-vmshq Started container hello-world
3m6s Normal Created Pod/scan-vulnerabilityreport-65767fcd48-vmshq Created container hello-world
2m52s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-65767fcd48 Job has reached the specified backoff limit
2m51s Normal SuccessfulCreate Job/scan-vulnerabilityreport-848cdf7f7 Created pod: scan-vulnerabilityreport-848cdf7f7-d5z7m
2m51s Normal Scheduled Pod/scan-vulnerabilityreport-848cdf7f7-d5z7m Successfully assigned trivy-system/scan-vulnerabilityreport-848
2m50s Normal SuccessfulCreate Job/scan-vulnerabilityreport-59b7678f68 Created pod: scan-vulnerabilityreport-59b7678f68-r2bs6
2m49s Normal Scheduled Pod/scan-vulnerabilityreport-59b7678f68-r2bs6 Successfully assigned trivy-system/scan-vulnerabilityreport-59b
2m48s Normal SuccessfulCreate Job/scan-vulnerabilityreport-7bd778f4ff Created pod: scan-vulnerabilityreport-7bd778f4ff-d2bvh
2m48s Normal Scheduled Pod/scan-vulnerabilityreport-7bd778f4ff-d2bvh Successfully assigned trivy-system/scan-vulnerabilityreport-7bd
2m47s Normal SuccessfulCreate Job/scan-vulnerabilityreport-64f9df68d Created pod: scan-vulnerabilityreport-64f9df68d-j9btm
2m47s Normal Pulled Pod/scan-vulnerabilityreport-848cdf7f7-d5z7m Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
2m47s Normal Scheduled Pod/scan-vulnerabilityreport-64f9df68d-j9btm Successfully assigned trivy-system/scan-vulnerabilityreport-64f
2m46s Normal Started Pod/scan-vulnerabilityreport-848cdf7f7-d5z7m Started container csi-driver
2m46s Normal Created Pod/scan-vulnerabilityreport-848cdf7f7-d5z7m Created container csi-driver
2m45s Normal Scheduled Pod/scan-vulnerabilityreport-dc6b86f4c-5ntjf Successfully assigned trivy-system/scan-vulnerabilityreport-dc6
2m45s Normal SuccessfulCreate Job/scan-vulnerabilityreport-dc6b86f4c Created pod: scan-vulnerabilityreport-dc6b86f4c-5ntjf
2m44s Normal Started Pod/scan-vulnerabilityreport-59b7678f68-r2bs6 Started container app-lb-test
2m44s Normal Pulled Pod/scan-vulnerabilityreport-59b7678f68-r2bs6 Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
2m44s Normal Created Pod/scan-vulnerabilityreport-59b7678f68-r2bs6 Created container app-lb-test
2m42s Normal Started Pod/scan-vulnerabilityreport-7bd778f4ff-d2bvh Started container trymount
2m42s Normal Created Pod/scan-vulnerabilityreport-7bd778f4ff-d2bvh Created container trymount
2m42s Normal Pulled Pod/scan-vulnerabilityreport-7bd778f4ff-d2bvh Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
2m40s Normal Pulled Pod/scan-vulnerabilityreport-dc6b86f4c-5ntjf Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
2m40s Normal Started Pod/scan-vulnerabilityreport-dc6b86f4c-5ntjf Started container helloworlds
2m40s Normal Created Pod/scan-vulnerabilityreport-dc6b86f4c-5ntjf Created container helloworlds
2m38s Normal Started Pod/scan-vulnerabilityreport-64f9df68d-j9btm Started container csi-driver
2m38s Normal Created Pod/scan-vulnerabilityreport-64f9df68d-j9btm Created container csi-driver
2m38s Normal Pulled Pod/scan-vulnerabilityreport-64f9df68d-j9btm Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
2m33s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-848cdf7f7 Job has reached the specified backoff limit
2m31s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-59b7678f68 Job has reached the specified backoff limit
2m29s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-7bd778f4ff Job has reached the specified backoff limit
2m26s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-dc6b86f4c Job has reached the specified backoff limit
2m23s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-64f9df68d Job has reached the specified backoff limit
2m8s Normal SuccessfulCreate Job/scan-vulnerabilityreport-59b7678f68 Created pod: scan-vulnerabilityreport-59b7678f68-tm66m
2m8s Normal Scheduled Pod/scan-vulnerabilityreport-59b7678f68-tm66m Successfully assigned trivy-system/scan-vulnerabilityreport-59b
2m7s Normal SuccessfulCreate Job/scan-vulnerabilityreport-848cdf7f7 Created pod: scan-vulnerabilityreport-848cdf7f7-kzlhj
2m6s Normal Scheduled Pod/scan-vulnerabilityreport-848cdf7f7-kzlhj Successfully assigned trivy-system/scan-vulnerabilityreport-848
2m5s Normal SuccessfulCreate Job/scan-vulnerabilityreport-64f9df68d Created pod: scan-vulnerabilityreport-64f9df68d-t9pqm
2m5s Normal Scheduled Pod/scan-vulnerabilityreport-64f9df68d-t9pqm Successfully assigned trivy-system/scan-vulnerabilityreport-64f
2m4s Normal SuccessfulCreate Job/scan-vulnerabilityreport-7bd778f4ff Created pod: scan-vulnerabilityreport-7bd778f4ff-nrsgd
2m4s Normal Pulled Pod/scan-vulnerabilityreport-59b7678f68-tm66m Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
2m4s Normal Scheduled Pod/scan-vulnerabilityreport-7bd778f4ff-nrsgd Successfully assigned trivy-system/scan-vulnerabilityreport-7bd
2m3s Normal Created Pod/scan-vulnerabilityreport-59b7678f68-tm66m Created container app-lb-test
2m3s Normal Started Pod/scan-vulnerabilityreport-59b7678f68-tm66m Started container app-lb-test
2m1s Normal Pulled Pod/scan-vulnerabilityreport-848cdf7f7-kzlhj Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
2m1s Normal Created Pod/scan-vulnerabilityreport-848cdf7f7-kzlhj Created container csi-driver
2m1s Normal Started Pod/scan-vulnerabilityreport-848cdf7f7-kzlhj Started container csi-driver
119s Normal Started Pod/scan-vulnerabilityreport-64f9df68d-t9pqm Started container csi-driver
119s Normal Created Pod/scan-vulnerabilityreport-64f9df68d-t9pqm Created container csi-driver
119s Normal Pulled Pod/scan-vulnerabilityreport-64f9df68d-t9pqm Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
117s Normal Started Pod/scan-vulnerabilityreport-7bd778f4ff-nrsgd Started container trymount
117s Normal Created Pod/scan-vulnerabilityreport-7bd778f4ff-nrsgd Created container trymount
117s Normal Pulled Pod/scan-vulnerabilityreport-7bd778f4ff-nrsgd Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
112s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-59b7678f68 Job has reached the specified backoff limit
110s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-848cdf7f7 Job has reached the specified backoff limit
108s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-64f9df68d Job has reached the specified backoff limit
105s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-7bd778f4ff Job has reached the specified backoff limit
85s Normal SuccessfulCreate Job/scan-vulnerabilityreport-7bd778f4ff Created pod: scan-vulnerabilityreport-7bd778f4ff-47r7t
85s Normal Scheduled Pod/scan-vulnerabilityreport-7bd778f4ff-47r7t Successfully assigned trivy-system/scan-vulnerabilityreport-7bd
84s Normal SuccessfulCreate Job/scan-vulnerabilityreport-848cdf7f7 Created pod: scan-vulnerabilityreport-848cdf7f7-tdhnt
83s Normal Scheduled Pod/scan-vulnerabilityreport-848cdf7f7-tdhnt Successfully assigned trivy-system/scan-vulnerabilityreport-848
82s Normal SuccessfulCreate Job/scan-vulnerabilityreport-59b7678f68 Created pod: scan-vulnerabilityreport-59b7678f68-nt2gn
82s Normal Scheduled Pod/scan-vulnerabilityreport-59b7678f68-nt2gn Successfully assigned trivy-system/scan-vulnerabilityreport-59b
81s Normal Pulled Pod/scan-vulnerabilityreport-7bd778f4ff-47r7t Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
81s Normal SuccessfulCreate Job/scan-vulnerabilityreport-64f9df68d Created pod: scan-vulnerabilityreport-64f9df68d-67wkh
81s Normal Started Pod/scan-vulnerabilityreport-7bd778f4ff-47r7t Started container trymount
81s Normal Created Pod/scan-vulnerabilityreport-7bd778f4ff-47r7t Created container trymount
81s Normal Scheduled Pod/scan-vulnerabilityreport-64f9df68d-67wkh Successfully assigned trivy-system/scan-vulnerabilityreport-64f
79s Normal Started Pod/scan-vulnerabilityreport-848cdf7f7-tdhnt Started container csi-driver
79s Normal Created Pod/scan-vulnerabilityreport-848cdf7f7-tdhnt Created container csi-driver
79s Normal Pulled Pod/scan-vulnerabilityreport-848cdf7f7-tdhnt Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
77s Normal Created Pod/scan-vulnerabilityreport-59b7678f68-nt2gn Created container app-lb-test
77s Normal Pulled Pod/scan-vulnerabilityreport-59b7678f68-nt2gn Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
77s Normal Started Pod/scan-vulnerabilityreport-59b7678f68-nt2gn Started container app-lb-test
75s Normal Pulled Pod/scan-vulnerabilityreport-64f9df68d-67wkh Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:
74s Normal Created Pod/scan-vulnerabilityreport-64f9df68d-67wkh Created container csi-driver
74s Normal Started Pod/scan-vulnerabilityreport-64f9df68d-67wkh Started container csi-driver
70s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-7bd778f4ff Job has reached the specified backoff limit
68s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-59b7678f68 Job has reached the specified backoff limit
65s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-848cdf7f7 Job has reached the specified backoff limit
63s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-64f9df68d Job has reached the specified backoff limit
43s Normal SuccessfulCreate Job/scan-vulnerabilityreport-59b7678f68 Created pod: scan-vulnerabilityreport-59b7678f68-k6zc6
42s Normal Scheduled Pod/scan-vulnerabilityreport-59b7678f68-k6zc6 Successfully assigned trivy-system/scan-vulnerabilityreport-59b
41s Normal SuccessfulCreate Job/scan-vulnerabilityreport-64f9df68d Created pod: scan-vulnerabilityreport-64f9df68d-x6hbr
41s Normal Scheduled Pod/scan-vulnerabilityreport-64f9df68d-x6hbr Successfully assigned trivy-system/scan-vulnerabilityreport-64f
40s Normal SuccessfulCreate Job/scan-vulnerabilityreport-7bd778f4ff Created pod: scan-vulnerabilityreport-7bd778f4ff-gz6tz
40s Normal Scheduled Pod/scan-vulnerabilityreport-7bd778f4ff-gz6tz Successfully assigned trivy-system/scan-vulnerabilityreport-7bd778f4ff-gz6tz to hdc1webwrkqa1.k8s.des.co
38s Normal Scheduled Pod/scan-vulnerabilityreport-848cdf7f7-wxh6r Successfully assigned trivy-system/scan-vulnerabilityreport-848cdf7f7-wxh6r to hdc1webwrkqa1.k8s.des.co
38s Normal SuccessfulCreate Job/scan-vulnerabilityreport-848cdf7f7 Created pod: scan-vulnerabilityreport-848cdf7f7-wxh6r
38s Normal Created Pod/scan-vulnerabilityreport-59b7678f68-k6zc6 Created container app-lb-test
38s Normal Started Pod/scan-vulnerabilityreport-59b7678f68-k6zc6 Started container app-lb-test
38s Normal Pulled Pod/scan-vulnerabilityreport-59b7678f68-k6zc6 Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:0.50.1" already present on machine
35s Normal Created Pod/scan-vulnerabilityreport-64f9df68d-x6hbr Created container csi-driver
35s Normal Started Pod/scan-vulnerabilityreport-64f9df68d-x6hbr Started container csi-driver
35s Normal Pulled Pod/scan-vulnerabilityreport-64f9df68d-x6hbr Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:0.50.1" already present on machine
33s Normal Started Pod/scan-vulnerabilityreport-7bd778f4ff-gz6tz Started container trymount
33s Normal Created Pod/scan-vulnerabilityreport-7bd778f4ff-gz6tz Created container trymount
33s Normal Pulled Pod/scan-vulnerabilityreport-7bd778f4ff-gz6tz Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:0.50.1" already present on machine
31s Normal Started Pod/scan-vulnerabilityreport-848cdf7f7-wxh6r Started container csi-driver
31s Normal Pulled Pod/scan-vulnerabilityreport-848cdf7f7-wxh6r Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:0.50.1" already present on machine
31s Normal Created Pod/scan-vulnerabilityreport-848cdf7f7-wxh6r Created container csi-driver
30s Normal SuccessfulCreate Job/scan-vulnerabilityreport-dc6b86f4c Created pod: scan-vulnerabilityreport-dc6b86f4c-hftl4
30s Normal Scheduled Pod/scan-vulnerabilityreport-dc6b86f4c-hftl4 Successfully assigned trivy-system/scan-vulnerabilityreport-dc6b86f4c-hftl4 to hdc1webwrkqa1.k8s.des.co
27s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-59b7678f68 Job has reached the specified backoff limit
24s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-64f9df68d Job has reached the specified backoff limit
22s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-7bd778f4ff Job has reached the specified backoff limit
19s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-848cdf7f7 Job has reached the specified backoff limit
19s Normal Started Pod/scan-vulnerabilityreport-dc6b86f4c-hftl4 Started container helloworlds
19s Normal Created Pod/scan-vulnerabilityreport-dc6b86f4c-hftl4 Created container helloworlds
19s Normal Pulled Pod/scan-vulnerabilityreport-dc6b86f4c-hftl4 Container image "artifactory.deshaw.com/k8s/aquasecurity/trivy:0.50.1" already present on machine
10s Warning BackoffLimitExceeded Job/scan-vulnerabilityreport-dc6b86f4c Job has reached the specified backoff limit
I had set the Job backoffLimit to 0.
As the events show, the same workloads (csi-driver, trymount, app-lb-test, etc.) are being scanned repeatedly: each scan job fails and a new one is created.
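For context, `backoffLimit` is a standard field on the Kubernetes Job spec; with a value of 0, a failed scan pod is not retried within the Job, so each new scan attempt appears in the events above as a freshly created Job/pod pair. A minimal sketch of the relevant part of such a Job spec (the Job name and container name are illustrative, not copied from the operator):

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: scan-vulnerabilityreport-example   # illustrative name
spec:
  backoffLimit: 0          # fail the Job after the first pod failure, no retries
  template:
    spec:
      restartPolicy: Never # required for Jobs that should not restart failed pods in place
      containers:
        - name: scanner    # illustrative container name
          image: artifactory.deshaw.com/k8s/aquasecurity/trivy:0.50.1
```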