Draft: Add an unofficial and unsupported Kluctl based deployment/distribution to /contrib
This adds a Kluctl based deployment to the contrib folder. It reuses and references the original Kustomize manifests as much as possible, introducing Kluctl-specific overlays only where templating needs to be added on top of them.
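To make this concrete, here is a minimal sketch of how a Kluctl deployment project can wrap existing Kustomize manifests; the paths and the `domain` arg are illustrative assumptions, not the actual layout of this PR:

```yaml
# deployment.yaml -- illustrative sketch; paths and args are assumptions
deployments:
  # Plain Kustomize directories, consumed unchanged: each path points at a
  # directory containing a kustomization.yaml.
  - path: cert-manager
  - path: istio
  # A Kluctl-specific overlay that may contain Jinja2 templating, e.g.
  # "{{ args.domain }}" resolved from the args of a target in .kluctl.yaml.
  - path: overlays/kubeflow
```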
I had a short discussion with @juliusvonkohout about this being part of manifests/contrib and we agreed that I should create a PoC and bring this to discussion. My plan is to also present this in the community and manifests meetings.
I'd suggest reading the README.md of this PR for more details about motivation and reasoning. Please note that this README.md originally comes from my out-of-tree version here (that repo is now outdated, as I moved all work into manifests/contrib), which means it might be worded in a way that doesn't sound like it's part of manifests/contrib. I will fix the wording in the future.
This is all still WIP and really just at the point where discussion can begin. Long-term, I believe that this is a viable form of a distribution that could be offered alongside plain/classical Kustomize deployments. I understand that the Kustomize-based deployment is not meant for end users, and I hope that the Kluctl-based version (which builds on top of the Kustomize manifests) can fill this gap.
I have also copied two e2e tests (pipeline_kind_kluctl_test.yaml and pipeline_m2m_kind_kluctl_test.yaml) to also showcase how this can be used in testing. @diegolovison This is what I mentioned in Slack a few days ago.
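To give a rough idea of what these tests do, a Kluctl-based e2e job essentially boils down to the following (a hypothetical excerpt; step names, the target name, and the omitted install steps are assumptions, not the actual workflow contents):

```yaml
# Hypothetical excerpt of a kind-based e2e workflow; kind/kluctl install steps omitted.
jobs:
  e2e:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Create kind cluster
        run: kind create cluster
      - name: Deploy with Kluctl
        # `kluctl deploy` applies the whole project; -t selects a target
        # defined in .kluctl.yaml, --yes skips the interactive confirmation.
        run: kluctl deploy --yes -t kind
```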
[APPROVALNOTIFIER] This PR is NOT APPROVED
This pull-request has been approved by: codablock. Once this PR has been reviewed and has the lgtm label, please assign juliusvonkohout for approval. For more information see the Kubernetes Code Review Process.
The full list of commands accepted by this bot can be found here.
Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment
Important: if you're reviewing this before the v2.24.0 release of Kluctl, you will need to download the devel release instead of following the installation instructions.
From the README.md
leading to a lot of complicated and partially manual steps required to install
I think the opposite; you just need to have docker and kind, and run `while ! kustomize build example | kubectl apply -f -; do echo "Retrying to apply resources"; sleep 10; done`
This project might even turn out to be a viable distribution of Kubeflow
I was going to say that this PR looks like a distribution of Kubeflow but I saw the same phrase in the README.md.
Maybe this PR should be a new repository instead, with your distribution listed at https://www.kubeflow.org/docs/started/installing-kubeflow/#packaged-distributions-of-kubeflow ?
I saw that your PR also contains some potential fixes. What do you think about putting those in a different PR? That way, the fixes can be applied to the project while the Kluctl discussion is happening.
I think the opposite; you just need to have docker and kind, and run `while ! kustomize build example | kubectl apply -f -; do echo "Retrying to apply resources"; sleep 10; done`
@diegolovison My statement also included things like choosing which kustomize deployments/overlays to enable and which to leave out, basically building your own kustomization.yaml from the example. The while+kustomize loop is another topic that I assume has plenty of potential for discussion about how optimal it is :)
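For context, here is roughly what such a hand-picked kustomization.yaml looks like; the selected component paths are illustrative and vary between releases:

```yaml
# kustomization.yaml -- illustrative subset; exact component paths vary by release
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../common/cert-manager/cert-manager/base
  - ../common/istio-1-17/istio-crds/base
  - ../apps/pipeline/upstream/env/cert-manager/platform-agnostic-multi-user
```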
I was going to say that this PR looks like a distribution of Kubeflow but I saw the same phrase in the README.md.
Yes, I'd like to make this part of the discussion. I'm fine with both approaches and already have a good idea of how I'd integrate the upstream manifests in case I go with a dedicated repo.
If this becomes part of manifests/contrib, it will somewhat become an official distribution, or at least one endorsed by the Kubeflow maintainers. If it moves into its own repository, it would "just" be another distribution. In that case, I'd also be happy to see some official support (e.g. by being listed in the distro list), as I feel that this type of distribution fits well into the overall project.
I saw that your PR also contains some potential fixes. What do you think about putting those in a different PR? That way, the fixes can be applied to the project while the Kluctl discussion is happening.
Yes, this was the plan. I will do this in the next days.
/contrib is no official endorsement. It is incubating stuff.
@codablock i am rerunning the tests now. Please check the second and third attempts.
We also need to confine everything to /contrib for now.
I've rebased the PR on master and force-pushed.
I extracted a few fixes of this PR into the upstream repos:
- https://github.com/kubeflow/pipelines/pull/10669. I added `sortOptions: legacy` to the Kluctl deployment for now, until this fix is synced back into manifests (see the sketch below this list).
- https://github.com/kubeflow/manifests/pull/2668
- https://github.com/kubeflow/manifests/pull/2669
- https://github.com/kubeflow/manifests/pull/2670
I'll rebase this PR again after these got merged.
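For reference, the legacy ordering opt-in mentioned above is a field in kustomization.yaml; a minimal sketch, with a placeholder resource path:

```yaml
# kustomization.yaml -- minimal sketch of opting into legacy resource ordering
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
sortOptions:
  order: legacy
resources:
  - upstream/base  # placeholder path, not the actual layout
```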
Apart from the above extracted PRs, this PR no longer touches anything outside of contrib/kluctl. I refactored it so that all of its overlays now live inside contrib. One thing unfortunately has to stay in apps: the .templateignore that forces Kluctl to never try to template the upstream manifests.
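For anyone unfamiliar with it, `.templateignore` uses gitignore-style patterns to exclude files from Kluctl's Jinja2 templating; a minimal sketch of what such a file could contain (not necessarily this PR's exact content):

```
# .templateignore -- gitignore-style patterns; sketch, not the PR's actual file
# Never run Jinja2 templating over anything below this directory.
*
```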
I've rebased on master and force-pushed. For some reason my Kluctl-based e2e workflows are not being run.
Are you sure that the paths you list are changed in this PR?
cc @kubeflow/kubeflow-steering-committee for input on having multiple, even if unofficial/unsupported, deployment options.
@codablock can you rebase to master? Especially the GitHub workflow changes seem outdated. Also, some patches might be outdated against the 1.9 changes, for example the menu links.
This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
@codablock I am closing this due to inactivity
@juliusvonkohout Sorry about that, I definitely plan to revive this topic in the future but from a different perspective.
Then just reopen as needed :-)