OPA Gatekeeper violations not getting reported
What steps did you take and what happened:
Hi, I am using Gatekeeper to enforce some policies in our k8s infrastructure. We are using the following ConstraintTemplate:
---
apiVersion: templates.gatekeeper.sh/v1
kind: ConstraintTemplate
metadata:
  name: k8senforcepaastacontract
  annotations:
    description: >-
      Requires resources to contain specified labels, with values matching
      provided regular expressions.
spec:
  crd:
    spec:
      names:
        kind: K8sEnforcePaastaContract
      validation:
        openAPIV3Schema:
          type: object
          properties:
            labels:
              type: array
              description: >-
                A list of labels and values the object must specify.
              items:
                type: object
                properties:
                  key:
                    type: string
                    description: >-
                      The required label.
                  allowedRegex:
                    type: string
                    description: >-
                      If specified, a regular expression the annotation's value
                      must match. The value must contain at least one match for
                      the regular expression.
  targets:
    - target: admission.k8s.gatekeeper.sh
      rego: |
        package k8senforcepaastacontract

        violation[{"msg": msg, "details": {"missing_labels": missing_labels, "owning_team": team}}] {
          team := input.review.object.metadata.labels["yelp.com/owner"]
          resource_name := input.review.object.metadata.name
          resource_type := input.review.object.kind
          provided := {label | input.review.object.metadata.labels[label]}
          required := {label | label := input.parameters.labels[_].key}
          missing_labels := required - provided
          count(missing_labels) > 0
          msg := sprintf("Missing labels for resource=%v of type=%v. Owning team=%v needs to add them.",
            [resource_name, resource_type, team])
        }

        violation[{"msg": msg}] {
          value := input.review.object.metadata.labels[key]
          expected := input.parameters.labels[_]
          expected.key == key
          # do not match if allowedRegex is not defined, or is an empty string
          expected.allowedRegex != ""
          not re_match(expected.allowedRegex, value)
          msg := sprintf("Label <%v: %v> does not satisfy allowed regex: %v", [key, value, expected.allowedRegex])
        }
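For illustration, a Pod that passes both rules would carry labels along these lines. This is a minimal sketch: the pod name, image, and label values are hypothetical, only the label keys and the regex behaviour come from the template above and the constraint further below.

apiVersion: v1
kind: Pod
metadata:
  name: example-service-0                 # hypothetical name
  labels:
    yelp.com/owner: compute_infra_core    # read by the first rule to attribute the owning team
    paasta.yelp.com/pool: default         # required key; value must match ^[a-zA-Z_-]{3,}
    paasta.yelp.com/cluster: infrastage   # required key; value must match ^[a-zA-Z_-]{3,}
spec:
  containers:
    - name: main                          # hypothetical container
      image: busybox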
and this is the corresponding Constraint:
---
apiVersion: constraints.gatekeeper.sh/v1beta1
kind: K8sEnforcePaastaContract
metadata:
  name: pods-follow-paasta-contract
  namespace: gatekeeper-system
  labels:
    yelp.com/owner: compute_infra_core
spec:
  enforcementAction: warn
  match:
    excludedNamespaces:
      - kube-*
      - gatekeeper-*
      - flux-*
    kinds:
      - apiGroups: [""]
        kinds: ["Pod"]
  parameters:
    labels:
      - key: paasta.yelp.com/pool
        allowedRegex: "^[a-zA-Z_-]{3,}"
      - key: paasta.yelp.com/cluster
        allowedRegex: "^[a-zA-Z_-]{3,}"
What did you expect to happen:
We expected to see the total number of violations reported after running kubectl get constraints, but the TOTAL-VIOLATIONS column came back empty:
akshaysha@dev56-uswest1adevc:~$ kubectl-infrastage get constraints
NAME                                                                              ENFORCEMENT-ACTION   TOTAL-VIOLATIONS
k8senforcepaastacontract.constraints.gatekeeper.sh/pods-follow-paasta-contract    warn
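For comparison, when audit results do get reported, we would expect the constraint object itself to carry a populated status, roughly like the sketch below. The values are illustrative and borrowed from the audit log lines further down, and other status fields such as byPod are omitted; this can be inspected with kubectl get k8senforcepaastacontract pods-follow-paasta-contract -o yaml.

status:
  auditTimestamp: "2022-09-26T18:49:02Z"   # matches the audit_id in the logs below
  totalViolations: 3                       # illustrative count
  violations:
    - enforcementAction: warn
      kind: Pod
      namespace: paasta
      name: monk--relay-darklaunch-0
      message: Missing labels for resource=monk--relay-darklaunch-0 of type=Pod. Owning team=compute_infra_platform_experience needs to add them.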
Anything else you would like to add:
Looking at the Gatekeeper audit logs, we can see that the audits are happening, but the results are somehow not being reported:
{"level":"info","ts":1664218149.2943795,"logger":"controller","msg":"Missing labels for resource=monk--relay-darklaunch-0 of type=Pod. Owning team=compute_infra_platform_experience needs to add them.","process":"audit","audit_id":"2022-09-26T18:49:02Z","details":{"missing_labels":["paasta.yelp.com/cluster"],"owning_team":"compute_infra_platform_experience"},"event_type":"violation_audited","constraint_group":"constraints.gatekeeper.sh","constraint_api_version":"v1beta1","constraint_kind":"K8sEnforcePaastaContract","constraint_name":"pods-follow-paasta-contract","constraint_namespace":"","constraint_action":"warn","resource_group":"","resource_api_version":"v1","resource_kind":"Pod","resource_namespace":"paasta","resource_name":"monk--relay-darklaunch-0"}
{"level":"info","ts":1664218149.3017128,"logger":"controller","msg":"Missing labels for resource=nrtsearch-operator-main-799785bb64-mh2nk of type=Pod. Owning team=compute_infra_platform_experience needs to add them.","process":"audit","audit_id":"2022-09-26T18:49:02Z","details":{"missing_labels":["paasta.yelp.com/cluster"],"owning_team":"compute_infra_platform_experience"},"event_type":"violation_audited","constraint_group":"constraints.gatekeeper.sh","constraint_api_version":"v1beta1","constraint_kind":"K8sEnforcePaastaContract","constraint_name":"pods-follow-paasta-contract","constraint_namespace":"","constraint_action":"warn","resource_group":"","resource_api_version":"v1","resource_kind":"Pod","resource_namespace":"paasta","resource_name":"nrtsearch-operator-main-799785bb64-mh2nk"}
{"level":"info","ts":1664218149.3085637,"logger":"controller","msg":"Missing labels for resource=scylla-operator-main-bb4db5658-4hw86 of type=Pod. Owning team=compute_infra_platform_experience needs to add them.","process":"audit","audit_id":"2022-09-26T18:49:02Z","details":{"missing_labels":["paasta.yelp.com/cluster"],"owning_team":"compute_infra_platform_experience"},"event_type":"violation_audited","constraint_group":"constraints.gatekeeper.sh","constraint_api_version":"v1beta1","constraint_kind":"K8sEnforcePaastaContract","constraint_name":"pods-follow-paasta-contract","constraint_namespace":"","constraint_action":"warn","resource_group":"","resource_api_version":"v1","resource_kind":"Pod","resource_namespace":"paasta","resource_name":"scylla-operator-main-bb4db5658-4hw86"}
{"level":"info","ts":1664218149.3152509,"logger":"controller","msg":"Missing labels for resource=tronlinks-main--stage-6bf54bb586-9kccw of type=Pod. Owning team=compute_infra_platform_experience needs to add them.","process":"audit","audit_id":"2022-09-26T18:49:02Z","details":{"missing_labels":["paasta.yelp.com/cluster"],"owning_team":"compute_infra_platform_experience"},"event_type":"violation_audited","constraint_group":"constraints.gatekeeper.sh","constraint_api_version":"v1beta1","constraint_kind":"K8sEnforcePaastaContract","constraint_name":"pods-follow-paasta-contract","constraint_namespace":"","constraint_action":"warn","resource_group":"","resource_api_version":"v1","resource_kind":"Pod","resource_namespace":"paasta","resource_name":"tronlinks-main--stage-6bf54bb586-9kccw"}
{"level":"info","ts":1664218149.3203297,"logger":"controller","msg":"Missing labels for resource=vitess-operator-main-68bb9877fc-5bgs9 of type=Pod. Owning team=compute_infra_platform_experience needs to add them.","process":"audit","audit_id":"2022-09-26T18:49:02Z","details":{"missing_labels":["paasta.yelp.com/cluster"],"owning_team":"compute_infra_platform_experience"},"event_type":"violation_audited","constraint_group":"constraints.gatekeeper.sh","constraint_api_version":"v1beta1","constraint_kind":"K8sEnforcePaastaContract","constraint_name":"pods-follow-paasta-contract","constraint_namespace":"","constraint_action":"warn","resource_group":"","resource_api_version":"v1","resource_kind":"Pod","resource_namespace":"paasta","resource_name":"vitess-operator-main-68bb9877fc-5bgs9"}
{"level":"info","ts":1664218149.322235,"logger":"controller","msg":"Resource=vitess-operator-main-75b4bffdb5-ps5z5 of type=Pod is missing the required labels={\"yelp.com/owner\"}","process":"audit","audit_id":"2022-09-26T18:49:02Z","details":{"missing_labels":["yelp.com/owner"]},"event_type":"violation_audited","constraint_group":"constraints.gatekeeper.sh","constraint_api_version":"v1beta1","constraint_kind":"K8sEnforceOwnerLabel","constraint_name":"pods-must-have-owner-label","constraint_namespace":"","constraint_action":"warn","resource_group":"","resource_api_version":"v1","resource_kind":"Pod","resource_namespace":"paasta","resource_name":"vitess-operator-main-75b4bffdb5-ps5z5"}
{"level":"info","ts":1664218149.3592486,"logger":"controller","msg":"Missing labels for resource=logspout-k8s-webhook-69f5bbcdcd-lc4tz of type=Pod. Owning team=compute_infra_core needs to add them.","process":"audit","audit_id":"2022-09-26T18:49:02Z","details":{"missing_labels":["paasta.yelp.com/cluster"],"owning_team":"compute_infra_core"},"event_type":"violation_audited","constraint_group":"constraints.gatekeeper.sh","constraint_api_version":"v1beta1","constraint_kind":"K8sEnforcePaastaContract","constraint_name":"pods-follow-paasta-contract","constraint_namespace":"","constraint_action":"warn","resource_group":"","resource_api_version":"v1","resource_kind":"Pod","resource_namespace":"webhooks","resource_name":"logspout-k8s-webhook-69f5bbcdcd-lc4tz"}
{"level":"info","ts":1664218149.5872676,"logger":"KubeAPIWarningLogger","msg":"v1 ComponentStatus is deprecated in v1.19+"}
{"level":"error","ts":1664218151.4940047,"logger":"controller","msg":"Unable to look up object namespace","process":"audit","audit_id":"2022-09-26T18:49:02Z","objNs":"paasta-elasticsearches","error":"namespaces \"paasta-elasticsearches\" not found","stacktrace":"github.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).auditResources\n\t/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:433\ngithub.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).audit\n\t/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:218\ngithub.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).auditManagerLoop\n\t/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:605"}
{"level":"info","ts":1664218151.529573,"logger":"KubeAPIWarningLogger","msg":"policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+"}
{"level":"error","ts":1664218092.1792336,"logger":"controller","msg":"audit manager audit() failed","process":"audit","error":"","errorVerbose":"\ngithub.com/open-policy-agent/gatekeeper/pkg/audit.mergeErrors\n\t/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:1009\ngithub.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).auditResources\n\t/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:442\ngithub.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).audit\n\t/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:218\ngithub.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).auditManagerLoop\n\t/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:605\nruntime.goexit\n\t/usr/local/go/src/runtime/asm_amd64.s:1571","stacktrace":"github.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).auditManagerLoop\n\t/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:606"}
Important thing to note: the audit logs end with this error:
{"level":"error","ts":1664218092.1792336,"logger":"controller","msg":"audit manager audit() failed"}
The stack trace being:
"\ngithub.com/open-policy-agent/gatekeeper/pkg/audit.mergeErrors
go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:1009
github.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).auditResources
/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:442\ngithub.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).audit
/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:218
github.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).auditManagerLoop\n\t/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:605
runtime.goexit\n\t/usr/local/go/src/runtime/asm_amd64.s:1571","stacktrace":"github.com/open-policy-agent/gatekeeper/pkg/audit.(*Manager).auditManagerLoop\n\t/go/src/github.com/open-policy-agent/gatekeeper/pkg/audit/manager.go:606
Environment:
- Gatekeeper version: 3.9.0
- Kubernetes version (use kubectl version): v1.21.14
This was caused by a CRD with a custom finalizer that blocked deletion. The namespace of the object had already been deleted, so Gatekeeper correctly failed to look up the namespace.
However, I feel like it's undesired behaviour that this blocked the complete audit loop and no violations on any resource could be reported.
This was caused by a CRD with a custom finalizer that blocked deletion. The namespace of the object had already been deleted, so Gatekeeper correctly failed to look up the namespace.
Can you share the CRD with the finalizer to help us repro the issue?
I think this was the CRD, this one: elasticsearch.k8s.elastic.co/v1beta1
NODES                    VERSION
paasta-elasticsearches   elasticsearch-elasticlucy720-k8s
I believe there is already an issue for this, but I think one hanging CRD should not block the whole audit loop.
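As a possible stop-gap while that is being addressed, one option might be to exclude the stuck namespace from the audit process via Gatekeeper's Config resource, roughly as sketched below. This is untested for this particular failure mode, so treat it as an assumption rather than a confirmed workaround.

apiVersion: config.gatekeeper.sh/v1alpha1
kind: Config
metadata:
  name: config                     # Gatekeeper expects the Config object to be named "config"
  namespace: gatekeeper-system
spec:
  match:
    - excludedNamespaces: ["paasta-elasticsearches"]   # the namespace the audit failed to look up
      processes: ["audit"]                             # only exclude it from audit, leave the webhook untouched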
Found some similar behavior in this issue, where audit has issues with reporting violations: https://github.com/open-policy-agent/gatekeeper/issues/2307
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 14 days if no further activity occurs. Thank you for your contributions.
@gmdfalk @akshaysharma096 @btwseeu78 This issue should have been fixed by https://github.com/open-policy-agent/gatekeeper/pull/2162. Can you please verify if your issue is fixed after upgrading to v3.10+? Please also indicate if you are using --audit-from-cache=true.
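For anyone checking their own deployment, --audit-from-cache is an argument on the gatekeeper-audit Deployment. A minimal sketch of where it appears, assuming a default install (container name, image tag, and the surrounding args may differ in your setup):

# excerpt from the gatekeeper-audit Deployment (sketch, not a complete manifest)
spec:
  template:
    spec:
      containers:
        - name: manager
          image: openpolicyagent/gatekeeper:v3.10.0   # any release from v3.10 onward per the comment above
          args:
            - --operation=audit
            - --audit-from-cache=false                # the flag the maintainers are asking about
            - --logtostderr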
When are you planning a stable release of 3.11? It's the holiday period and I just don't want to introduce anything new, so are you recommending we test the 3.11 beta?
We plan to release 3.11 next week. In the meantime, you can try any release from v3.10 onward, since https://github.com/open-policy-agent/gatekeeper/pull/2162 was merged before v3.10: https://github.com/open-policy-agent/gatekeeper/releases
Closing this issue as it has been addressed in v3.11. Feel free to reopen if not.