
tests: Add releases unit tests

Open aarnq opened this issue 10 months ago • 3 comments

[!warning] This is a public repository, ensure not to disclose:

  • [x] personal data beyond what is necessary for interacting with this pull request, nor
  • [x] business confidential information, such as customer names.

What kind of PR is this?

Required: Mark one of the following that is applicable:

  • [ ] kind/feature
  • [x] kind/improvement
  • [ ] kind/deprecation
  • [ ] kind/documentation
  • [ ] kind/clean-up
  • [ ] kind/bug
  • [ ] kind/other

Optional: Mark one or more of the following that are applicable:

[!important] Breaking changes should be marked kind/admin-change or kind/dev-change depending on type. Critical security fixes should be marked with kind/security.

  • [ ] kind/admin-change
  • [ ] kind/dev-change
  • [ ] kind/security
  • [ ] kind/adr

What does this PR do / why do we need this PR?

This adds test coverage for the needs set on releases, which should catch errors such as these.
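To illustrate the kind of property such a test asserts, here is a minimal sketch in pure bash: every release listed as a dependency must itself be defined. The release names are made up, and the real tests render the helmfile rather than hard-coding lists.

```shell
# Hypothetical sketch: verify that every "needs" entry points at a
# release that actually exists. Names are illustrative only.
declare -A defined=( [cert-manager]=1 [ingress-nginx]=1 )
declare -A needs=( [ingress-nginx]=cert-manager )

for release in "${!needs[@]}"; do
  dep="${needs[$release]}"
  if [[ -z "${defined[$dep]:-}" ]]; then
    echo "release ${release} needs undefined release ${dep}" >&2
    exit 1
  fi
done
echo "all needs resolve"
```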

Information to reviewers

This takes ages to run since it has to render every release that it can, though I've tried to optimise it so that it only ever has to render each release once and can then pull it from the cache.
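The render-once caching described above can be sketched roughly as follows; `render_release` is a stand-in for the real helmfile render, and the cache layout is an assumption, not the PR's actual implementation:

```shell
# Sketch of render-once caching: key the cache on a hash of the input,
# render on a miss, reuse the cached output afterwards.
cache_dir="$(mktemp -d)"
render_count=0

render_release() {
  # stand-in for the real helmfile/helm template render
  render_count=$((render_count + 1))
  echo "rendered: $1"
}

cached_render() {
  local key
  key="$(printf '%s' "$1" | sha256sum | cut -d' ' -f1)"
  if [[ ! -f "${cache_dir}/${key}" ]]; then
    render_release "$1" > "${cache_dir}/${key}"
  fi
  cat "${cache_dir}/${key}"
}

cached_render ingress-nginx >/dev/null
cached_render ingress-nginx >/dev/null
echo "renders: ${render_count}"   # prints "renders: 1" - second call hit the cache
```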

Checklist

  • [x] Proper commit message prefix on all commits
  • Change checks:
    • [x] The change is transparent
    • [ ] The change is disruptive
    • [x] The change requires no migration steps
    • [ ] The change requires migration steps
    • [ ] The change upgrades CRDs
  • Metrics checks:
    • [ ] The metrics are still exposed and present in Grafana after the change
    • [ ] The metrics names didn't change (Grafana dashboards and Prometheus alerts are not affected)
    • [ ] The metrics names did change (Grafana dashboards and Prometheus alerts were fixed)
  • Logs checks:
    • [ ] The logs do not show any errors after the change
  • Pod Security Policy checks:
    • [ ] Any changed pod is covered by Pod Security Admission
    • [ ] Any changed pod is covered by Gatekeeper Pod Security Policies
    • [ ] The change does not cause any pods to be blocked by Pod Security Admission or Policies
  • Network Policy checks:
    • [ ] Any changed pod is covered by Network Policies
    • [ ] The change does not cause any dropped packets in the NetworkPolicy Dashboard
  • Audit checks:
    • [ ] The change does not cause any unnecessary Kubernetes audit events
    • [ ] The change requires changes to Kubernetes audit policy
  • Falco checks:
    • [ ] The change does not cause any alerts to be generated by Falco
  • Bug checks:
    • [ ] The bug fix is covered by regression tests
  • Config checks:
    • [ ] The schema was updated

aarnq avatar Apr 09 '24 13:04 aarnq

> This is impressive work!

Thank you!

> As far as I can tell it is doing what it is supposed to, but some parts of the code were pretty hard to fully comprehend. I wonder if another language could make these kinds of tests more readable, compared to long lines of yq. We don't want to trap ourselves in code that only one person will be able to fix in the future! 😄

Probably. I think we can reevaluate which language we write our tests in once we've migrated the bin scripts to Golang. I think it will be more natural to make that switch then as well.

I'll try to restructure and comment it a bit so it might become easier to understand. :sweat_smile:

aarnq avatar Apr 16 '24 11:04 aarnq

Question to reviewers, do we think it would be better to test this per infrastructure provider or not?

Also, what do you think of this method of making templated tests? It is much faster, it would allow one to inline the templating where needed, making it easier to understand, and, crucially for this PR, to split it into multiple files.

aarnq avatar Apr 22 '24 14:04 aarnq

> Question to reviewers, do we think it would be better to test this per infrastructure provider or not?

In an ideal world we would run it for every "config variation", right? Doing it per infrastructure provider is at least one step closer to that so I vote yes.

> Also, what do you think of this method of making templated tests? It is much faster, it would allow one to inline the templating where needed, making it easier to understand, and, crucially for this PR, to split it into multiple files.

I like it! But I would also like the generated files to be committed, with a pre-commit hook that makes sure there is no uncommitted diff after generating them.
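The pre-commit idea could look something like the sketch below: regenerate, then fail if the working tree changed. The generator command is a stand-in (`true` in the demo), and the demo runs in a throwaway repo, not this repository's actual hook.

```shell
# Hedged sketch of a "no uncommitted generated diff" pre-commit check.
check_generated_up_to_date() {
  "$@"                      # run the (stand-in) generation command
  if ! git diff --quiet; then
    echo "generated files are out of date; regenerate and commit" >&2
    return 1
  fi
}

# demo in a throwaway repository
repo="$(mktemp -d)"
cd "${repo}"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m init
check_generated_up_to_date true && echo "generated files up to date"
```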

simonklb avatar Apr 23 '24 08:04 simonklb

> Question to reviewers, do we think it would be better to test this per infrastructure provider or not?

> In an ideal world we would run it for every "config variation", right? Doing it per infrastructure provider is at least one step closer to that so I vote yes.

> Also, what do you think of this method of making templated tests? It is much faster, it would allow one to inline the templating where needed, making it easier to understand, and, crucially for this PR, to split it into multiple files.

> I like it! But I would also like the generated files to be committed, with a pre-commit hook that makes sure there is no uncommitted diff after generating them.

Right, yeah I don't think we should do this per config variation right now given how slow they are. :sweat_smile:

But I can look into the changes to the templated tests. I think the main thing is that we need to reduce the number of lines that have to be generated, so it might need to be split up differently. As it is now, these new releases tests would generate ~270 lines per infra provider, when only two lines differ between those files.
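The per-provider generation being discussed can be sketched as a loop over providers writing one test file each from a shared template; the provider names and file contents here are made up:

```shell
# Illustrative sketch: generate one test file per infrastructure
# provider, so only the provider-specific lines differ between files.
out_dir="$(mktemp -d)"

for provider in aws azure openstack; do
  cat > "${out_dir}/releases-${provider}.yaml" <<EOF
provider: ${provider}
suite: releases
# ...the ~270 shared lines would follow here...
EOF
done

ls "${out_dir}"
```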

aarnq avatar Apr 24 '24 07:04 aarnq

Migrated to the new test generation method, and integrated these tests into the validate suite so it can take advantage of its cache as best as possible.

aarnq avatar Apr 26 '24 12:04 aarnq

> Migrated to the new test generation method, and integrated these tests into the validate suite so it can take advantage of its cache as best as possible.

Caching in tests makes me worry. :smile: But I get why it's being done here.

To be fair, caching in general creates nightmares... :smile:

simonklb avatar Apr 26 '24 13:04 simonklb