
(aws-eks): Cannot remove logging configuration after setting it


Describe the bug

Updating a cluster to remove the control plane logging configuration does not succeed. The rollback then also fails, with a slightly different error.

Expected Behavior

The cluster should be updated or at least the rollback should complete.

Current Behavior

The update fails with the error "The type for cluster update was not provided." The rollback then also fails, with a slightly different error: "No changes needed for the logging config provided."

Reproduction Steps

  1. Create and deploy an empty cluster:
import * as path from "path";
import * as cdk from "aws-cdk-lib";
import { aws_ec2 as ec2, aws_eks as eks, aws_iam as iam, aws_lambda as lambda } from "aws-cdk-lib";
import { Construct } from "constructs";

export interface EksClusterStackProps extends cdk.StackProps {
  readonly vpcId: string;
}

export class EksClusterStack extends cdk.Stack {
  public readonly cluster: eks.Cluster;

  constructor(scope: Construct, id: string, props: EksClusterStackProps) {
    super(scope, id, props);

    // Cluster /////////////////////////////////////////////////////////////////
    const clusterAdminRole = new iam.Role(this, "ClusterAdminRole", {
      assumedBy: new iam.AccountRootPrincipal(),
    });

    const vpc = ec2.Vpc.fromLookup(this, "MainVpc", { vpcId: props.vpcId });
    this.cluster = new eks.Cluster(this, "EksCluster", {
      vpc: vpc,
      vpcSubnets: [{ subnetType: ec2.SubnetType.PRIVATE_WITH_NAT }],
      clusterName: `${id}`,
      mastersRole: clusterAdminRole,
      version: eks.KubernetesVersion.V1_22,
      kubectlLayer: new lambda.LayerVersion(this, "KubectlLayer", {
        code: lambda.Code.fromAsset(path.join(__dirname, "layers", "kubectl.zip")),
      }),
    });
  }
}
  2. Change the cluster stack to add the logging configuration below and deploy it (a combined sketch of the resulting cluster props follows these steps):
      clusterLogging: [
        eks.ClusterLoggingTypes.API,
        eks.ClusterLoggingTypes.AUDIT,
        eks.ClusterLoggingTypes.AUTHENTICATOR,
        eks.ClusterLoggingTypes.CONTROLLER_MANAGER,
        eks.ClusterLoggingTypes.SCHEDULER,
      ],
  3. Now remove the entry above again so the stack matches the original setup and deploy it (or remove only some of the log types and keep the rest). The Custom::AWSCDK-EKS-Cluster resource fails to update with the following error:
Received response status [FAILED] from custom resource. Message returned: No changes needed for the logging config provided Logs: /aws/lambda/InfraMainCluster-awscdkawse-OnEventHandler at Object.extractError (/var/runtime/node_modules/aws-sdk/lib/protocol/json.js:52:27) at Request.extractError (/var/runtime/node_modules/aws-sdk/lib/protocol/rest_json.js:49:8) at Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:106:20) at Request.emit (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:78:10) at Request.emit (/var/runtime/node_modules/aws-sdk/lib/request.js:686:14) at Request.transition (/var/runtime/node_modules/aws-sdk/lib/request.js:22:10) at AcceptorStateMachine.runTo (/var/runtime/node_modules/aws-sdk/lib/state_machine.js:14:12) at /var/runtime/node_modules/aws-sdk/lib/state_machine.js:26:10 at Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:38:9) at Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:688:12)
  4. Try to complete the rollback; it fails with a slightly different error.
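
For reference, a minimal sketch (abridged from the snippets above; values are the ones from step 1) of where the clusterLogging property sits in the cluster definition:

this.cluster = new eks.Cluster(this, "EksCluster", {
  vpc: vpc,
  vpcSubnets: [{ subnetType: ec2.SubnetType.PRIVATE_WITH_NAT }],
  clusterName: `${id}`,
  mastersRole: clusterAdminRole,
  version: eks.KubernetesVersion.V1_22,
  kubectlLayer: kubectlLayer, // as in step 1, abbreviated here
  // Added in step 2 and removed again in step 3; removing it is what
  // triggers the failed update described above.
  clusterLogging: [
    eks.ClusterLoggingTypes.API,
    eks.ClusterLoggingTypes.AUDIT,
    eks.ClusterLoggingTypes.AUTHENTICATOR,
    eks.ClusterLoggingTypes.CONTROLLER_MANAGER,
    eks.ClusterLoggingTypes.SCHEDULER,
  ],
});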

Possible Solution

🤷‍♂️

Additional Information/Context

No response

CDK CLI Version

2.20.0 (build 738ef49)

Framework Version

2.20.0

Node.js Version

v16.13.0

OS

Darwin Version 21.4.0

Language

Typescript

Language Version

Version 3.9.10

Other information

No response

akefirad avatar Apr 13 '22 13:04 akefirad

⚠️COMMENT VISIBILITY WARNING⚠️

Comments on closed issues are hard for our team to see. If you need more assistance, please either tag a team member or open a new issue that references this one. If you wish to keep having a conversation with other community members under this issue feel free to do so.

github-actions[bot] avatar Aug 01 '22 22:08 github-actions[bot]

We are running into the same issue on an existing cluster (EKS 1.22 and CDK 2.53). Deploy stack with:

  1. API logging enabled
  2. Authentication logging enabled

Afterwards, deploy with cluster logging disabled.
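
A minimal sketch of that configuration, assuming it maps to the CDK clusterLogging property and that "Authentication logging" refers to the authenticator log type:

clusterLogging: [
  eks.ClusterLoggingTypes.API,
  eks.ClusterLoggingTypes.AUTHENTICATOR,
],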

Is there any update on when this issue will be resolved?

rtroost2012 avatar Nov 29 '22 13:11 rtroost2012

In case this helps someone: I was getting "Received response status [FAILED] from custom resource. Message returned: The type for cluster update was not provided." even with this fix in place (CDK 2.59.0).

Resolved by manually turning off all logging on the EKS cluster, then applying the change in CloudFormation. (A sketch of the manual step is below.)
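
A minimal sketch of that manual step using the AWS SDK for JavaScript v3 (an assumption; the same can be done from the console or the aws eks update-cluster-config CLI command). The cluster name is a placeholder:

import { EKSClient, UpdateClusterConfigCommand } from "@aws-sdk/client-eks";

// Disable every control plane log type so the cluster's actual state matches
// the template before the CloudFormation update is retried.
async function disableControlPlaneLogging(): Promise<void> {
  const client = new EKSClient({});
  await client.send(
    new UpdateClusterConfigCommand({
      name: "my-cluster", // placeholder cluster name
      logging: {
        clusterLogging: [
          {
            types: ["api", "audit", "authenticator", "controllerManager", "scheduler"],
            enabled: false,
          },
        ],
      },
    }),
  );
}

disableControlPlaneLogging().catch(console.error);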

plumdog avatar Jan 17 '23 13:01 plumdog

Had the same issue on my end with:

Received response status [FAILED] from custom resource. Message returned: No changes needed for the logging config provided Logs

I deployed it with:

clusterLogging: [
  eks.ClusterLoggingTypes.API,
  eks.ClusterLoggingTypes.AUDIT,
  eks.ClusterLoggingTypes.AUTHENTICATOR,
  eks.ClusterLoggingTypes.CONTROLLER_MANAGER,
  eks.ClusterLoggingTypes.SCHEDULER,
],

Trying to remove the API logging line and running another CDK Deploy results in the error.

I found a workaround: use a custom resource to apply the logging update instead, including an onDelete call (a sketch of that pattern follows). Credit: https://github.com/aws/aws-cdk/issues/4159#issuecomment-855625700
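
A minimal sketch of that kind of workaround, assuming the aws-cdk-lib custom_resources AwsCustomResource construct calling eks:UpdateClusterConfig; the construct id, log types, and resource scope are illustrative, and the linked comment may differ in detail:

import { custom_resources as cr } from "aws-cdk-lib";

// Manage control plane logging outside of the Custom::AWSCDK-EKS-Cluster
// resource: enable it on create/update and disable it again on delete.
const loggingTypes = ["api", "audit", "authenticator", "controllerManager", "scheduler"];

new cr.AwsCustomResource(this, "ControlPlaneLogging", {
  onUpdate: { // also runs on create when onCreate is omitted
    service: "EKS",
    action: "updateClusterConfig",
    parameters: {
      name: cluster.clusterName,
      logging: { clusterLogging: [{ types: loggingTypes, enabled: true }] },
    },
    physicalResourceId: cr.PhysicalResourceId.of(`${cluster.clusterName}-logging`),
  },
  onDelete: {
    service: "EKS",
    action: "updateClusterConfig",
    parameters: {
      name: cluster.clusterName,
      logging: { clusterLogging: [{ types: loggingTypes, enabled: false }] },
    },
    // If EKS answers "No changes needed for the logging config provided",
    // ignoreErrorCodesMatching could be added here to tolerate it.
  },
  policy: cr.AwsCustomResourcePolicy.fromSdkCalls({
    resources: cr.AwsCustomResourcePolicy.ANY_RESOURCE,
  }),
});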

AlecZebrick avatar Jan 31 '23 02:01 AlecZebrick

Trying to remove logging via CDK like this: (screenshot of the code change omitted)

Results in this change plan: (screenshot of the CloudFormation diff omitted)

Resulting in:

1:49:09 PM | UPDATE_FAILED | Custom::AWSCDK-EKS-Cluster | EksClusterFAB68BDB Received response status [FAILED] from custom resource. Message returned: No changes needed for the logging config provided

Removing the clusterLogging option completely results in the same error.

AlecZebrick avatar Feb 07 '23 04:02 AlecZebrick