
Git Sourced YAML in a Deploy raw Kubernetes YAML step does not add Octopus deployment labels to the deployment YAML.

Open Clare-Octopus opened this issue 1 year ago • 0 comments

Severity

Not blocking customers from deploying, but it is unclear what functionality depends on those labels.

Version

2023.4.10989

Latest Version

I could reproduce the problem in the latest build

What happened?

When deploying to Kubernetes using a Deploy raw Kubernetes YAML step with the YAML sourced from Inline YAML, Octopus adds certain labels to the deployment and populates them with the deployment information, e.g.:

Octopus.Action.Id: deploy-kubernetes-yaml
Octopus.Deployment.Id: deployments-92852
Octopus.Deployment.Tenant.Id: untenanted

Below is an example of a deployment using inline YAML as its source. In both Rancher and the deployment's YAML you can see the labels have been added and the deployment information included:

(Screenshot: inline-YAML deployment in Rancher showing the Octopus labels)

However, when deploying using a Deploy raw Kubernetes YAML step with the YAML sourced from Git, the labels are not added:

(Screenshot: Git-sourced deployment with no Octopus labels)

Octopus uses those labels to identify the deployment and, in some cases, for the Kubernetes Object Status check in the deployment tasks menu in the UI. Some users have mentioned they use the labels to clean up Kubernetes resources linked to the specific project but not to the current deployment, keeping their cluster clean with more resources available. This is negatively affecting some customers.

Reproduction

  1. Have a Kubernetes Cluster and a Kubernetes Deployment target (API or Agent) which has a successful health check to your Octopus instance.
  2. Create a Project in Octopus and add two Deploy raw Kubernetes YAML steps. Set the first step up to use inline YAML and paste in some deployment YAML. Set the second step up to use YAML sourced from Git. Differentiate the two deployments either by giving them different metadata names in the YAML itself or by deploying the two steps to two different namespaces in your Kubernetes Cluster.
  3. Create a release and deploy that project.
  4. Check the Kubernetes Cluster deployment for the inline YAML, see it have Octopus specific labels.
  5. Check the Kubernetes Cluster deployment for the YAML sourced from Git and see that no Octopus-specific labels have been added.
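A minimal manifest that could be used for step 2 of the reproduction (the name, namespace, and image here are placeholder assumptions; adjust them to differentiate the two steps):

```yaml
# Hypothetical minimal Deployment for the repro.
# Use e.g. name: web-git (or a different namespace) for the Git-sourced step.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-inline
  namespace: repro-inline
  labels:
    app: web
spec:
  replicas: 1
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: nginx:1.25
```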

Error and Stacktrace

Inline YAML - 

  labels:
    Octopus.Action.Id: deploy-kubernetes-yaml
    Octopus.Deployment.Id: deployments-92852
    Octopus.Deployment.Tenant.Id: untenanted
    Octopus.Environment.Id: environments-1801
    Octopus.Kubernetes.SelectionStrategyVersion: SelectionStrategyVersion2
    Octopus.Project.Id: projects-5841
    Octopus.RunbookRun.Id: ''
    Octopus.Step.Id: deploy-kubernetes-yaml
    app: web

YAML Sourced from Git -

  labels:
    app: web
  managedFields:

More Information

Initial Ticket (internal) - https://octopus.zendesk.com/agent/tickets/200206
R and D (internal) - https://octopusdeploy.slack.com/archives/CNHBHV2BX/p1724342593666409
Reproduction (internal) - https://octopus-operations.octopus.app/app#/Spaces-422/projects/helm-test/branches/refs%2Fheads%2Fmain/deployments/process
You won't be able to see the YAML on the deployment for the reproduction, but it will show you the project setup.

Workaround

One user mentioned they have worked around this by setting the labels themselves based on Octopus variables. This is the same customer mentioned above who uses the labels to clean up Kubernetes resources linked to the specific project but not to the current deployment, so the missing labels are negatively affecting them.
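A sketch of what that workaround could look like, assuming Octopus variable substitution (`#{...}`) is applied to the Git-sourced manifest before it is deployed (the exact label set mirrors the inline-YAML example above; whether substitution runs on your Git-sourced files depends on your step configuration):

```yaml
# Hypothetical workaround: set the labels yourself in the Git-sourced
# manifest using Octopus variable substitution syntax.
metadata:
  labels:
    app: web
    Octopus.Deployment.Id: "#{Octopus.Deployment.Id}"
    Octopus.Project.Id: "#{Octopus.Project.Id}"
    Octopus.Environment.Id: "#{Octopus.Environment.Id}"
```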

Clare-Octopus · Aug 22 '24 15:08