tofu-controller
Destroy failing because plan is too big for a Secret object
I have a Terraform object that has been deleted, but the deletion is stuck. Its last apply failed with errors, yet since it has destroyResourcesOnDeletion set, the controller should still run a destroy and then allow the object to be removed. Instead, saving the destroy plan fails because the plan is too large to fit in a Secret.
Using the v0.14.0-rc5 runner and controller.
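For context, the "Too long: must have at most 1048576 bytes" error in the output below is Kubernetes rejecting a Secret whose data exceeds 1 MiB. A minimal sketch of that constraint, with a hypothetical helper (`plan_fits_in_secret` is illustrative only, not the controller's actual code; whether the controller compresses the plan is an assumption here):

```python
import gzip
import os

# Kubernetes rejects Secret objects whose total data exceeds 1 MiB,
# which is the limit the controller hits when saving the plan.
SECRET_DATA_LIMIT = 1048576  # bytes, per the "Too long" error message

def plan_fits_in_secret(plan_bytes: bytes, compress: bool = True) -> bool:
    """Hypothetical helper: would this plan payload, optionally
    gzip-compressed, fit inside a single Secret's data budget?"""
    payload = gzip.compress(plan_bytes) if compress else plan_bytes
    return len(payload) <= SECRET_DATA_LIMIT

# A large but highly compressible payload fits once compressed...
print(plan_fits_in_secret(b"x" * (2 * SECRET_DATA_LIMIT)))      # True
# ...but incompressible data of the same size does not.
print(plan_fits_in_secret(os.urandom(2 * SECRET_DATA_LIMIT)))   # False
```

This is only to show why a plan for a large module tree (like the one below, with many node groups and IAM resources) can blow past the limit regardless of compression.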
paulc:weaveworks-20276 paulc [sandbox]$ kubectl logs -f beta-test-cluster-beta-test-c2-config-tf-runner
2023/02/23 17:29:59 Starting the runner... version sha
I0223 17:30:00.529027 7 request.go:682] Waited for 1.022524085s due to client-side throttling, not priority and fairness, request: GET:https://172.20.0.1:443/apis/kustomize.toolkit.fluxcd.io/v1beta2?timeout=32s
{"level":"info","ts":"2023-02-23T17:30:11.999Z","logger":"runner.terraform","msg":"preparing for Upload and Extraction","instance-id":""}
{"level":"info","ts":"2023-02-23T17:30:12.078Z","logger":"runner.terraform","msg":"write backend config","instance-id":"","path":"/tmp/default-beta-test-cluster-beta-test-c2-config/cluster-templates/cluster-config","config":"backend_override.tf"}
{"level":"info","ts":"2023-02-23T17:30:12.078Z","logger":"runner.terraform","msg":"write config to file","instance-id":"","filePath":"/tmp/default-beta-test-cluster-beta-test-c2-config/cluster-templates/cluster-config/backend_override.tf"}
{"level":"info","ts":"2023-02-23T17:30:12.079Z","logger":"runner.terraform","msg":"looking for path","instance-id":"","file":"terraform"}
{"level":"info","ts":"2023-02-23T17:30:12.082Z","logger":"runner.terraform","msg":"creating new terraform","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1","workingDir":"/tmp/default-beta-test-cluster-beta-test-c2-config/cluster-templates/cluster-config","execPath":"/usr/local/bin/terraform"}
{"level":"info","ts":"2023-02-23T17:30:12.113Z","logger":"runner.terraform","msg":"setting envvars","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
{"level":"info","ts":"2023-02-23T17:30:12.113Z","logger":"runner.terraform","msg":"getting envvars from os environments","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
{"level":"info","ts":"2023-02-23T17:30:12.119Z","logger":"runner.terraform","msg":"setting up the input variables","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
{"level":"info","ts":"2023-02-23T17:30:12.119Z","logger":"runner.terraform","msg":"mapping the Spec.Values","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
{"level":"info","ts":"2023-02-23T17:30:12.119Z","logger":"runner.terraform","msg":"mapping the Spec.Vars","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
{"level":"info","ts":"2023-02-23T17:30:12.119Z","logger":"runner.terraform","msg":"mapping the Spec.VarsFrom","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
{"level":"info","ts":"2023-02-23T17:30:12.159Z","logger":"runner.terraform","msg":"generating the template founds"}
{"level":"info","ts":"2023-02-23T17:30:12.159Z","logger":"runner.terraform","msg":"main.tf.tpl not found, skipping"}
{"level":"info","ts":"2023-02-23T17:30:12.171Z","logger":"runner.terraform","msg":"initializing","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
{"level":"info","ts":"2023-02-23T17:30:12.171Z","logger":"runner.terraform","msg":"mapping the Spec.BackendConfigsFrom","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
{
"terraform_version": "1.3.7",
"platform": "linux_amd64",
"provider_selections": {},
"terraform_outdated": true
}
Upgrading modules...
- aws_auth in ../../modules/aws-auth
- aws_ebs_csi_driver in ../../modules/aws-ebs-csi-driver
- aws_efs_csi_driver in ../../modules/aws-efs-csi-driver
- aws_load_balancer_controller in ../../modules/aws-load-balancer-controller
- cluster_autoscaler in ../../modules/cluster-autoscaler
- external_dns in ../../modules/external-dns
- flux_bootstrap in ../../modules/flux-bootstrap
- leaf_config in ../../modules/leaf-config
- matrixx_cnf_multus_node_group in ../../modules/node-group
- matrixx_proc_node_group in ../../modules/node-group
- matrixx_pub_ckpt_node_group in ../../modules/node-group
- matrixx_sba_node_group in ../../modules/node-group
- matrixx_shared_node_group in ../../modules/node-group
- oracle_node_group in ../../modules/node-group
- system_node_group in ../../modules/node-group
- tf_controller in ../../modules/tf-controller
- vault_k8s_auth in ../../modules/vault-k8s-auth
- worker_node_group in ../../modules/node-group
Initializing the backend...
Successfully configured the backend "s3"! Terraform will automatically
use this backend unless the backend configuration changes.
Initializing provider plugins...
- terraform.io/builtin/terraform is built in to Terraform
- Finding hashicorp/kubernetes versions matching "~> 2.14, ~> 2.16"...
- Finding hashicorp/external versions matching "~> 2.2.2"...
- Finding hashicorp/null versions matching ">= 3.1.1"...
- Finding hashicorp/vault versions matching "~> 3.11"...
- Finding hashicorp/aws versions matching "~> 4.33"...
- Finding gitlabhq/gitlab versions matching "~> 15.7"...
- Finding fluxcd/flux versions matching ">= 0.20.0"...
- Finding hashicorp/tls versions matching "~> 4.0"...
- Finding gavinbunney/kubectl versions matching "~> 1.14"...
- Finding hashicorp/random versions matching ">= 3.4.3, ~> 3.4.3"...
- Finding hashicorp/http versions matching "~> 3.2"...
- Installing gavinbunney/kubectl v1.14.0...
- Installed gavinbunney/kubectl v1.14.0 (self-signed, key ID AD64217B5ADD572F)
- Installing hashicorp/kubernetes v2.18.1...
- Installed hashicorp/kubernetes v2.18.1 (signed by HashiCorp)
- Installing hashicorp/null v3.2.1...
- Installed hashicorp/null v3.2.1 (signed by HashiCorp)
- Installing hashicorp/aws v4.55.0...
- Installed hashicorp/aws v4.55.0 (signed by HashiCorp)
- Installing hashicorp/tls v4.0.4...
- Installed hashicorp/tls v4.0.4 (signed by HashiCorp)
- Installing hashicorp/random v3.4.3...
- Installed hashicorp/random v3.4.3 (signed by HashiCorp)
- Installing hashicorp/http v3.2.1...
- Installed hashicorp/http v3.2.1 (signed by HashiCorp)
- Installing hashicorp/external v2.2.3...
- Installed hashicorp/external v2.2.3 (signed by HashiCorp)
- Installing hashicorp/vault v3.13.0...
- Installed hashicorp/vault v3.13.0 (signed by HashiCorp)
- Installing gitlabhq/gitlab v15.9.0...
- Installed gitlabhq/gitlab v15.9.0 (self-signed, key ID 0D47B7AB85F63F65)
- Installing fluxcd/flux v0.24.1...
- Installed fluxcd/flux v0.24.1 (self-signed, key ID D5D3316A880BB5B9)
Partner and community providers are signed by their developers.
If you'd like to know more about provider signing, you can read about it here:
https://www.terraform.io/docs/cli/plugins/signing.html
Terraform has created a lock file .terraform.lock.hcl to record the provider
selections it made above. Include this file in your version control repository
so that Terraform can guarantee to make the same selections by default when
you run "terraform init" in the future.
Terraform has been successfully initialized!
{"level":"info","ts":"2023-02-23T17:30:35.228Z","logger":"runner.terraform","msg":"workspace select"}
{"level":"info","ts":"2023-02-23T17:30:35.244Z","logger":"runner.terraform","msg":"creating a plan","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
No changes. No objects need to be destroyed.
Either you have not created any objects yet or the existing objects were
already deleted outside of Terraform.
{"level":"info","ts":"2023-02-23T17:31:10.306Z","logger":"runner.terraform","msg":"save the plan","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
{"level":"info","ts":"2023-02-23T17:31:10.451Z","logger":"runner.terraform","msg":"loading plan from secret","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1"}
{"level":"error","ts":"2023-02-23T17:31:10.471Z","logger":"runner.terraform","msg":"plan name mismatch","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1","error":"error pending plan and plan's name in the secret are not matched: != plan-beta-template-d30ae58c70b262c17eaa0661c1a96bd6d4904e33"}
{"level":"info","ts":"2023-02-23T17:31:10.483Z","logger":"runner.terraform","msg":"cleanup TmpDir","instance-id":"e6f97a20-a04b-4428-9f27-cb1f1e9989d1","tmpDir":"/tmp/default-beta-test-cluster-beta-test-c2-config"}
paulc:weaveworks-20276 paulc [sandbox]$ kubectl get terraforms.infra.contrib.fluxcd.io
NAME READY STATUS AGE
beta-test-cluster-beta-test-c1-config Unknown Applying 4h21m
beta-test-cluster-beta-test-c2-config Unknown Applying 4h20m
beta-test-cluster-c1-config Unknown Terraform Planning 4h5m
beta-test-cluster-c2-config Unknown Terraform Planning 4h4m
beta-test-vpc-v1 Unknown Applying 5h43m
dish-beta-cluster-c1-config Unknown Initializing 3h26m
dish-beta-cluster-c1-eks True No drift: beta-template/d30ae58c70b262c17eaa0661c1a96bd6d4904e33 3h36m
dish-beta-cluster-c2-config False error saving plan secret: rpc error: code = Unknown desc = error recording plan status: Secret "tfplan-default-dish-beta-cluster-c2-config" is invalid: data: Too long: must have at most 1048576 bytes 3h23m
dish-beta-cluster-c2-eks True No drift: beta-template/d30ae58c70b262c17eaa0661c1a96bd6d4904e33 3h36m
dish-beta-cnfs-c1-oracle True No drift: beta-template/d30ae58c70b262c17eaa0661c1a96bd6d4904e33 3h19m
dish-beta-cnfs-c2-matrixx True No drift: beta-template/d30ae58c70b262c17eaa0661c1a96bd6d4904e33 3h18m
dish-beta-vpc-v1 True Outputs written: beta-template/d30ae58c70b262c17eaa0661c1a96bd6d4904e33 3h39m
paulc:weaveworks-20276 paulc [sandbox]$ kubectl get terraforms.infra.contrib.fluxcd.io beta-test-cluster-c1-config -o yaml
apiVersion: infra.contrib.fluxcd.io/v1alpha1
kind: Terraform
metadata:
creationTimestamp: "2023-02-23T13:26:11Z"
deletionGracePeriodSeconds: 0
deletionTimestamp: "2023-02-23T13:56:12Z"
finalizers:
- finalizers.tf.contrib.fluxcd.io
generation: 3
labels:
kustomize.toolkit.fluxcd.io/name: beta-test-cluster-c1-config
kustomize.toolkit.fluxcd.io/namespace: default
name: beta-test-cluster-c1-config
namespace: default
resourceVersion: "4508831"
uid: f2e6c378-93d0-437a-bef0-8266c8d789f0
spec:
alwaysCleanupRunnerPod: true
approvePlan: auto
backendConfig:
customConfiguration: |
backend "s3" {
key = "ww-sandbox-wge/default/beta-test/config/c1/terraform.tfstate"
bucket = "ww-20276-749339757188-us-east-1-tf-state"
region = "us-east-1"
encrypt = true
dynamodb_table = "ww-20276-749339757188-us-east-1-tf-state"
}
destroyResourcesOnDeletion: true
disableDriftDetection: false
force: false
interval: 10m
parallelism: 0
path: ./cluster-templates/cluster-config
refreshBeforeApply: false
retryInterval: 20s
runnerTerminationGracePeriodSeconds: 30
serviceAccountName: tf-runner
sourceRef:
kind: GitRepository
name: terraform
namespace: flux-system
storeReadablePlan: none
vars:
- name: cluster_name
value: beta-test-c1
- name: region
value: us-east-1
- name: target_path
value: clusters/default/beta-test-c1
- name: desired_size
value: "2"
- name: eks_core_state_bucket
value: ww-20276-749339757188-us-east-1-tf-state
- name: eks_core_state_key
value: ww-sandbox-wge/default/beta-test/eks/c1/terraform.tfstate
varsFrom:
- kind: ConfigMap
name: leaf-cluster-config
- kind: Secret
name: leaf-cluster-auth
- kind: ConfigMap
name: tf-output-values
varsKeys:
- gitlab_known_hosts
- harbor_registry
- vault_url
- wge_profiles_url
- cluster_admin_roles_string
- cluster_admin_users_string
workspace: default
writeOutputsToSecret:
name: beta-test-cluster-c1-config
status:
conditions:
- lastTransitionTime: "2023-02-23T13:56:38Z"
message: Terraform Planning
reason: Progressing
status: Unknown
type: Ready
- lastTransitionTime: "2023-02-23T13:37:21Z"
message: Plan generated
reason: TerraformPlannedWithChanges
status: "True"
type: Plan
- lastTransitionTime: "2023-02-23T13:55:45Z"
message: "error running Apply: rpc error: code = Internal desc = exit status 1\n\nError:
creating Route 53 Record: InvalidChangeBatch: [Tried to create resource record
set [name='beta-test-c1.ww-sandbox-wge.sandbox.weave.works.', type='NS'] but
it already exists]\n\tstatus code: 400, request id: 3afefa9d-bdb2-49d6-8f01-6f2008734224\n\n
\ with aws_route53_record.sub_ns,\n on main.tf line 151, in resource \"aws_route53_record\"
\"sub_ns\":\n 151: resource \"aws_route53_record\" \"sub_ns\" {\n\n\nError:
creating IAM Role (beta-test-c1-aws-ebs-csi-driver): EntityAlreadyExists: Role
with name beta-test-c1-aws-ebs-csi-driver already exists.\n\tstatus code: 409,
request id: 271def3e-6dad-4a1d-9535-5d48f95ec686\n\n with module.aws_ebs_csi_driver.aws_iam_role.aws_ebs_csi_driver,\n
\ on ../../modules/aws-ebs-csi-driver/main.tf line 19, in resource \"aws_iam_role\"
\"aws_ebs_csi_driver\":\n 19: resource \"aws_iam_role\" \"aws_ebs_csi_driver\"
{\n\n\nError: creating IAM Role (beta-test-c1-aws-efs-csi-driver): EntityAlreadyExists:
Role with name beta-test-c1-aws-efs-csi-driver already exists.\n\tstatus code:
409, request id: d9aeae3e-dc15-4bcb-8833-f3717c1a39d6\n\n with module.aws_efs_csi_driver.aws_iam_role.aws_efs_csi_driver,\n
\ on ../../modules/aws-efs-csi-driver/main.tf line 19, in resource \"aws_iam_role\"
\"aws_efs_csi_driver\":\n 19: resource \"aws_iam_role\" \"aws_efs_csi_driver\"
{\n\n\nError: creating IAM Policy beta-test-c1-aws-efs-csi-driver: EntityAlreadyExists:
A policy called beta-test-c1-aws-efs-csi-driver already exists. Duplicate names
are not allowed.\n\tstatus code: 409, request id: 72c524bd-ffc8-4330-b002-d7909a4867ee\n\n
\ with module.aws_efs_csi_driver.aws_iam_policy.aws_efs_csi_driver,\n on ../../modules/aws-efs-csi-driver/main.tf
line 71, in resource \"aws_iam_policy\" \"aws_efs_csi_driver\":\n 71: resource
\"aws_iam_policy\" \"aws_efs_csi_driver\" {\n\n\nError: creating IAM Role (beta-test-c1-aws-load-balancer-controller):
EntityAlreadyExists: Role with name beta-test-c1-aws-load-balancer-controller
already exists.\n\tstatus code: 409, request id: c9c396ab-98f6-4eab-bcef-94a1454f74c3\n\n
\ with module.aws_load_balancer_controller.aws_iam_role.aws_load_balancer_controller,\n
\ on ../../modules/aws-load-balancer-controller/main.tf line 22, in resource
\"aws_iam_role\" \"aws_load_balancer_controller\":\n 22: resource \"aws_iam_role\"
\"aws_load_balancer_controller\" {\n\n\nError: creating IAM Policy beta-test-c1-aws-load-balancer-controller:
EntityAlreadyExists: A policy called beta-test-c1-aws-load-balancer-controller
already exists. Duplicate names are not allowed.\n\tstatus code: 409, request
id: 116b4fd8-5c29-4559-afc1-7ff9437f3c19\n\n with module.aws_load_balancer_controller.aws_iam_policy.aws_load_balancer_controller,\n
\ on ../../modules/aws-load-balancer-controller/main.tf line 35, in resource
\"aws_iam_policy\" \"aws_load_balancer_controller\":\n 35: resource \"aws_iam_policy\"
\"aws_load_balancer_controller\" {\n\n\nError: creating IAM Role (beta-test-c1-cluster-autoscaler):
EntityAlreadyExists: Role with name beta-test-c1-cluster-autoscaler already
exists.\n\tstatus code: 409, request id: ba4703a4-3a8d-4219-8420-82fb39305ce0\n\n
\ with module.cluster_autoscaler.aws_iam_role.cluster_autoscaler,\n on ../../modules/cluster-autoscaler/main.tf
line 21, in resource \"aws_iam_role\" \"cluster_autoscaler\":\n 21: resource
\"aws_iam_role\" \"cluster_autoscaler\" {\n\n\nError: creating IAM Policy beta-test-c1-cluster-autoscaler:
EntityAlreadyExists: A policy called beta-test-c1-cluster-autoscaler already
exists. Duplicate names are not allowed.\n\tstatus code: 409, request id: 481312d8-7d0d-4ca9-bf14-a58bd5f2543c\n\n
\ with module.cluster_autoscaler.aws_iam_policy.cluster_autoscaler,\n on ../../modules/cluster-autoscaler/main.tf
line 61, in resource \"aws_iam_policy\" \"cluster_autoscaler\":\n 61: resource
\"aws_iam_policy\" \"cluster_autoscaler\" {\n\n\nError: creating IAM Role (beta-test-c1-external-dns):
EntityAlreadyExists: Role with name beta-test-c1-external-dns already exists.\n\tstatus
code: 409, request id: 9e1d60f5-8776-49c6-834f-667f7eeea0ae\n\n with module.external_dns.aws_iam_role.external_dns,\n
\ on ../../modules/external-dns/main.tf line 20, in resource \"aws_iam_role\"
\"external_dns\":\n 20: resource \"aws_iam_role\" \"external_dns\" {\n\n\nError:
creating IAM Policy beta-test-c1-external-dns: EntityAlreadyExists: A policy
called beta-test-c1-external-dns already exists. Duplicate names are not allowed.\n\tstatus
code: 409, request id: 81c7e296-39b9-4501-8a0d-3abdec86ad9d\n\n with module.external_dns.aws_iam_policy.external_dns,\n
\ on ../../modules/external-dns/main.tf line 48, in resource \"aws_iam_policy\"
\"external_dns\":\n 48: resource \"aws_iam_policy\" \"external_dns\" {\n\n\nError:
flux-system/helm-controller failed to run apply: error when retrieving current
configuration of:\nResource: \"apps/v1, Resource=deployments\", GroupVersionKind:
\"apps/v1, Kind=Deployment\"\nName: \"helm-controller\", Namespace: \"flux-system\"\nfrom
server for: \"/tmp/832693405kubectl_manifest.yaml\": Get \"https://8F04F89B4C462ED8537A0DE38AA65EFF.gr7.us-east-1.eks.amazonaws.com/apis/apps/v1/namespaces/flux-system/deployments/helm-controller\":
dial tcp: lookup 8F04F89B4C462ED8537A0DE38AA65EFF.gr7.us-east-1.eks.amazonaws.com
on 172.20.0.10:53: no such host\n\n with module.flux_bootstrap.kubectl_manifest.install[\"apps/v1/deployment/flux-system/helm-controller\"],\n
\ on ../../modules/flux-bootstrap/main.tf line 51, in resource \"kubectl_manifest\"
\"install\":\n 51: resource \"kubectl_manifest\" \"install\" {\n\n\nError:
flux-system/source-controller failed to run apply: error when retrieving current
configuration of:\nResource: \"apps/v1, Resource=deployments\", GroupVersionKind:
\"apps/v1, Kind=Deployment\"\nName: \"source-controller\", Namespace: \"flux-system\"\nfrom
server for: \"/tmp/119193587kubectl_manifest.yaml\": Get \"https://8F04F89B4C462ED8537A0DE38AA65EFF.gr7.us-east-1.eks.amazonaws.com/apis/apps/v1/namespaces/flux-system/deployments/source-controller\":
dial tcp: lookup 8F04F89B4C462ED8537A0DE38AA65EFF.gr7.us-east-1.eks.amazonaws.com
on 172.20.0.10:53: no such host\n\n with module.flux_bootstrap.kubectl_manifest.install[\"apps/v1/deployment/flux-system/source-controller\"],\n
\ on ../../modules/flux-bootstrap/main.tf line 51, in resource \"kubectl_manifest\"
\"install\":\n 51: resource \"kubectl_manifest\" \"install\" {\n\n\nError:
flux-system/notification-controller failed to run apply: error when retrieving
current configuration of:\nResource: \"apps/v1, Resource=deployments\", GroupVersionKind:
\"apps/v1, Kind=Deployment\"\nName: \"notification-controller\", Namespace:
\"flux-system\"\nfrom server for: \"/tmp/487905764kubectl_manifest.yaml\": Get
\"https://8F04F89B4C462ED8537A0DE38AA65EFF.gr7.us-east-1.eks.amazonaws.com/apis/apps/v1/namespaces/flux-system/deployments/notification-controller\":
dial tcp: lookup 8F04F89B4C462ED8537A0DE38AA65EFF.gr7.us-east-1.eks.amazonaws.com
on 172.20.0.10:53: no such host\n\n with module.flux_bootstrap.kubectl_manifest.install[\"apps/v1/deployment/flux-system/notification-controller\"],\n
\ on ../../modules/flux-bootstrap/main.tf line 51, in resource \"kubectl_manifest\"
\"install\":\n 51: resource \"kubectl_manifest\" \"install\" {\n\n\nError:
flux-system/kustomize-controller failed to run apply: error when retrieving
current configuration of:\nResource: \"apps/v1, Resource=deployments\", GroupVersionKind:
\"apps/v1, Kind=Deployment\"\nName: \"kustomize-controller\", Namespace: \"flux-system\"\nfrom
server for: \"/tmp/208568502kubectl_manifest.yaml\": Get \"https://8F04F89B4C462ED8537A0DE38AA65EFF.gr7.us-east-1.eks.amazonaws.com/apis/apps/v1/namespaces/flux-system/deployments/kustomize-controller\":
dial tcp: lookup 8F04F89B4C462ED8537A0DE38AA65EFF.gr7.us-east-1.eks.amazonaws.com
on 172.20.0.10:53: no such host\n\n with module.flux_bootstrap.kubectl_manifest.install[\"apps/v1/deployment/flux-system/kustomize-controller\"],\n
\ on ../../modules/flux-bootstrap/main.tf line 51, in resource \"kubectl_manifest\"
\"install\":\n 51: resource \"kubectl_manifest\" \"install\" {\n\n\nError:
POST https://gitlab.com/api/v4/projects/43643635/repository/files/clusters/management/secrets/leaf-clusters/beta-test-c1-kubeconfig.yaml:
400 {message: A file with this name already exists}\n\n with module.leaf_config.gitlab_repository_file.kubeconfig,\n
\ on ../../modules/leaf-config/main.tf line 101, in resource \"gitlab_repository_file\"
\"kubeconfig\":\n 101: resource \"gitlab_repository_file\" \"kubeconfig\" {\n\n\nError:
creating EC2 Launch Template: InvalidLaunchTemplateName.AlreadyExistsException:
Launch template name already in use.\n\tstatus code: 400, request id: 480760cd-7ff9-4c26-9fc2-b2f73eac2c06\n\n
\ with module.system_node_group.aws_launch_template.this,\n on ../../modules/node-group/main.tf
line 34, in resource \"aws_launch_template\" \"this\":\n 34: resource \"aws_launch_template\"
\"this\" {\n\n\nError: creating EC2 Launch Template: InvalidLaunchTemplateName.AlreadyExistsException:
Launch template name already in use.\n\tstatus code: 400, request id: ecaf1abd-276c-48af-b2d3-6c2266b243e0\n\n
\ with module.oracle_node_group.aws_launch_template.this,\n on ../../modules/node-group/main.tf
line 34, in resource \"aws_launch_template\" \"this\":\n 34: resource \"aws_launch_template\"
\"this\" {\n\n\nError: creating EC2 Launch Template: InvalidLaunchTemplateName.AlreadyExistsException:
Launch template name already in use.\n\tstatus code: 400, request id: 0174d520-feb2-431d-8808-905cf41a0a14\n\n
\ with module.matrixx_shared_node_group.aws_launch_template.this,\n on ../../modules/node-group/main.tf
line 34, in resource \"aws_launch_template\" \"this\":\n 34: resource \"aws_launch_template\"
\"this\" {\n\n\nError: creating EC2 Launch Template: InvalidLaunchTemplateName.AlreadyExistsException:
Launch template name already in use.\n\tstatus code: 400, request id: 6d65cd5d-79cc-410d-b08e-bfe0df290617\n\n
\ with module.worker_node_group.aws_launch_template.this,\n on ../../modules/node-group/main.tf
line 34, in resource \"aws_launch_template\" \"this\":\n 34: resource \"aws_launch_template\"
\"this\" {\n\n\nError: creating EC2 Launch Template: InvalidLaunchTemplateName.AlreadyExistsException:
Launch template name already in use.\n\tstatus code: 400, request id: b2cb8719-0c38-4dbb-8fe9-3c1d6758c33c\n\n
\ with module.matrixx_sba_node_group.aws_launch_template.this,\n on ../../modules/node-group/main.tf
line 34, in resource \"aws_launch_template\" \"this\":\n 34: resource \"aws_launch_template\"
\"this\" {\n\n\nError: creating EC2 Launch Template: InvalidLaunchTemplateName.AlreadyExistsException:
Launch template name already in use.\n\tstatus code: 400, request id: f1c07b80-6a92-4abf-a6ae-06a01c98fec2\n\n
\ with module.matrixx_cnf_multus_node_group.aws_launch_template.this,\n on
../../modules/node-group/main.tf line 34, in resource \"aws_launch_template\"
\"this\":\n 34: resource \"aws_launch_template\" \"this\" {\n\n\nError: creating
EC2 Launch Template: InvalidLaunchTemplateName.AlreadyExistsException: Launch
template name already in use.\n\tstatus code: 400, request id: 1a740339-d4dd-4554-a359-bcee36286c70\n\n
\ with module.matrixx_pub_ckpt_node_group.aws_launch_template.this,\n on ../../modules/node-group/main.tf
line 34, in resource \"aws_launch_template\" \"this\":\n 34: resource \"aws_launch_template\"
\"this\" {\n\n\nError: creating EC2 Launch Template: InvalidLaunchTemplateName.AlreadyExistsException:
Launch template name already in use.\n\tstatus code: 400, request id: 3c135140-956c-4e50-a747-4ff5120c86f6\n\n
\ with module.matrixx_proc_node_group.aws_launch_template.this,\n on ../../modules/node-group/main.tf
line 34, in resource \"aws_launch_template\" \"this\":\n 34: resource \"aws_launch_template\"
\"this\" {\n\n\nError: creating IAM Role (beta-test-c1-matrixx-shared-ng-role):
EntityAlreadyExists: Role with name beta-test-c1-matrixx-shared-ng-role already
exists.\n\tstatus code: 409, request id: 8e8b346c-1178-4324-884d-a9d67060cd9f\n\n
\ with module.matrixx_shared_node_group.aws_iam_role.eks_node_group_role,\n
\ on ../../modules/node-group/main.tf line 87, in resource \"aws_iam_role\"
\"eks_node_group_role\":\n 87: resource \"aws_iam_role\" \"eks_node_group_role\"
{\n\n\nError: creating IAM Role (beta-test-c1-matrixx-sba-ng-role): EntityAlreadyExists:
Role with name beta-test-c1-matrixx-sba-ng-role already exists.\n\tstatus code:
409, request id: 48573883-d625-4732-9b64-e221462bab8b\n\n with module.matrixx_sba_node_group.aws_iam_role.eks_node_group_role,\n
\ on ../../modules/node-group/main.tf line 87, in resource \"aws_iam_role\"
\"eks_node_group_role\":\n 87: resource \"aws_iam_role\" \"eks_node_group_role\"
{\n\n\nError: creating IAM Role (beta-test-c1-worker-ng-role): EntityAlreadyExists:
Role with name beta-test-c1-worker-ng-role already exists.\n\tstatus code: 409,
request id: 3c0f937b-5d83-4907-915d-e9e404470208\n\n with module.worker_node_group.aws_iam_role.eks_node_group_role,\n
\ on ../../modules/node-group/main.tf line 87, in resource \"aws_iam_role\"
\"eks_node_group_role\":\n 87: resource \"aws_iam_role\" \"eks_node_group_role\"
{\n\n\nError: creating IAM Role (beta-test-c1-oracle-ng-role): EntityAlreadyExists:
Role with name beta-test-c1-oracle-ng-role already exists.\n\tstatus code: 409,
request id: 2e17154a-2fbd-4d03-81a8-1aec6a935c27\n\n with module.oracle_node_group.aws_iam_role.eks_node_group_role,\n
\ on ../../modules/node-group/main.tf line 87, in resource \"aws_iam_role\"
\"eks_node_group_role\":\n 87: resource \"aws_iam_role\" \"eks_node_group_role\"
{\n\n\nError: creating IAM Role (beta-test-c1-matrixx-proc-ng-role): EntityAlreadyExists:
Role with name beta-test-c1-matrixx-proc-ng-role already exists.\n\tstatus code:
409, request id: c3d54edf-74eb-439e-a41d-7452b6947eab\n\n with module.matrixx_proc_node_group.aws_iam_role.eks_node_group_role,\n
\ on ../../modules/node-group/main.tf line 87, in resource \"aws_iam_role\"
\"eks_node_group_role\":\n 87: resource \"aws_iam_role\" \"eks_node_group_role\"
{\n\n\nError: creating IAM Role (beta-test-c1-matrixx-pub-ckpt-ng-role): EntityAlreadyExists:
Role with name beta-test-c1-matrixx-pub-ckpt-ng-role already exists.\n\tstatus
code: 409, request id: 70407893-ea5d-43a4-bed5-8be500b159d7\n\n with module.matrixx_pub_ckpt_node_group.aws_iam_role.eks_node_group_role,\n
\ on ../../modules/node-group/main.tf line 87, in resource \"aws_iam_role\"
\"eks_node_group_role\":\n 87: resource \"aws_iam_role\" \"eks_node_group_role\"
{\n\n\nError: creating IAM Role (beta-test-c1-matrixx-cnf-multus-ng-role): EntityAlreadyExists:
Role with name beta-test-c1-matrixx-cnf-multus-ng-role already exists.\n\tstatus
code: 409, request id: d7a573b3-269d-4b7c-8886-baf8e2276d1b\n\n with module.matrixx_cnf_multus_node_group.aws_iam_role.eks_node_group_role,\n
\ on ../../modules/node-group/main.tf line 87, in resource \"aws_iam_role\"
\"eks_node_group_role\":\n 87: resource \"aws_iam_role\" \"eks_node_group_role\"
{\n\n\nError: creating IAM Role (beta-test-c1-system-ng-role): EntityAlreadyExists:
Role with name beta-test-c1-system-ng-role already exists.\n\tstatus code: 409,
request id: 556ec205-414c-41ff-bfa6-b16828c9b0dd\n\n with module.system_node_group.aws_iam_role.eks_node_group_role,\n
\ on ../../modules/node-group/main.tf line 87, in resource \"aws_iam_role\"
\"eks_node_group_role\":\n 87: resource \"aws_iam_role\" \"eks_node_group_role\"
{\n\n\nError: creating IAM Role (beta-test-c1-tf-controller): EntityAlreadyExists:
Role with name beta-test-c1-tf-controller already exists.\n\tstatus code: 409,
request id: ae9b05d5-8ae8-48e4-b88c-517b552172e5\n\n with module.tf_controller.aws_iam_role.tf_controller,\n
\ on ../../modules/tf-controller/main.tf line 19, in resource \"aws_iam_role\"
\"tf_controller\":\n 19: resource \"aws_iam_role\" \"tf_controller\" {\n\n\nError:
error writing to Vault: Error making API request.\n\nURL: POST https://vault.ww-sandbox.sandbox.weave.works/v1/sys/auth/beta-test-c1\nCode:
400. Errors:\n\n* path is already in use at beta-test-c1/\n\n with module.vault_k8s_auth.vault_auth_backend.kubernetes,\n
\ on ../../modules/vault-k8s-auth/main.tf line 35, in resource \"vault_auth_backend\"
\"kubernetes\":\n 35: resource \"vault_auth_backend\" \"kubernetes\" {\n\n"
reason: TerraformAppliedFail
status: "False"
type: Apply
lastAttemptedRevision: beta-template/96c13c5a2358e24b104b1dac622ece53d612c65a
lastDriftDetectedAt: "2023-02-23T13:49:16Z"
lastPlannedRevision: beta-template/96c13c5a2358e24b104b1dac622ece53d612c65a
observedGeneration: 2
plan: {}
paulc:weaveworks-20276 paulc [sandbox]$