
Module is forcing cluster creation when updating cluster with secret encryption key #1249

Closed
1 of 4 tasks
kkapoor1987 opened this issue Feb 18, 2021 · 10 comments

kkapoor1987 commented Feb 18, 2021

I am having issues updating an existing cluster to use an encryption key for secrets.

I'm submitting a...

  • [x] bug report
  • [ ] feature request
  • [ ] support request - read the FAQ first!
  • [ ] kudos, thank you, warm fuzzy

What is the current behavior?

Currently, supplying an encryption key for secrets forces the module to recreate the cluster, which then fails because a cluster with the same name already exists:

  # module.eks.aws_eks_cluster.this[0] must be replaced
+/- resource "aws_eks_cluster" "this" {
      ~ arn                       = "arn:aws:eks:us-east-1:*******:cluster/test-eks-mytest-kapok" -> (known after apply)
      ~ certificate_authority     = [
          - {
              - data = "**********************"
            },
        ] -> (known after apply)
      ~ created_at                = "2021-02-18 14:45:16.33 +0000 UTC" -> (known after apply)
      - enabled_cluster_log_types = [] -> null
      ~ endpoint                  = "**********************" -> (known after apply)
      ~ id                        = "test-eks-mytest-kapok" -> (known after apply)
      ~ identity                  = [
          - {
              - oidc = [
                  - {
                      - issuer = "https://oidc.eks.us-east-1.amazonaws.com/id/**********************"
                    },
                ]
            },
        ] -> (known after apply)
        name                      = "test-eks-mytest-kapok"
      ~ platform_version          = "eks.5" -> (known after apply)
        role_arn                  = "arn:aws:iam::********:role/test-eks-mytest-kapok20210218144353497300000002"
      ~ status                    = "ACTIVE" -> (known after apply)
      ~ tags                      = {
          - "CostCenterCode" = "6779" -> null
          - "Department"     = "cloudengineering-us" -> null
            "Environment"    = "test"
            "GithubOrg"      = "terraform-aws-modules"
            "GithubRepo"     = "terraform-aws-eks"
          - "Manager"        = "gonzc" -> null
            "Owner"          = "kapok"
          - "OwnerEmailID"   = "[email protected]" -> null
        }
        version                   = "1.17"

      + encryption_config { # forces replacement
          + resources = [
              + "secrets",
            ] # forces replacement

          + provider { # forces replacement
              + key_arn = "arn:aws:kms:us-east-1:********:key/789e115e-7934-4146-b896-46baafd33e19" # forces replacement
            }
        }

      ~ kubernetes_network_config {
          ~ service_ipv4_cidr = "172.20.0.0/16" -> (known after apply)
        }

        timeouts {
            create = "30m"
            delete = "15m"
        }

      ~ vpc_config {
          ~ cluster_security_group_id = "sg-086046380bf2d6706" -> (known after apply)
            endpoint_private_access   = false
            endpoint_public_access    = true
            public_access_cidrs       = [
                "0.0.0.0/0",
            ]
            security_group_ids        = [
                "sg-03d94cde957c15074",
            ]
            subnet_ids                = [
                "subnet-06b9356a56089ac3b",
                "subnet-0934a31b0f092dcf0",
                "subnet-0f43ae8446d5fbe60",
            ]
          ~ vpc_id                    = "vpc-0c2e790ccdda7a9c9" -> (known after apply)
        }
    }

Terraform Error:

module.eks.aws_eks_cluster.this[0]: Creating...

Error: error creating EKS Cluster (test-eks-mytest-kapok): ResourceInUseException: Cluster already exists with name: test-eks-mytest-kapok
{
RespMetadata: {
  StatusCode: 409,
  RequestID: "d7f0364c-c2a0-45c8-86ea-563f849079b7"
},
ClusterName: "test-eks-mytest-kapok",
Message_: "Cluster already exists with name: test-eks-mytest-kapok"
}

If this is a bug, how to reproduce? Please include a code sample if relevant.

Create an EKS cluster without an encryption key for secrets, then try to update the same cluster through the module by passing an encryption key. The module will try to create a new cluster instead of updating the existing one.
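A minimal sketch of the reproduction, assuming the module's `cluster_encryption_config` input in v14.0.0; the subnet/VPC IDs are taken from the plan above and the KMS key resource name is hypothetical:

```hcl
# Step 1: apply this module WITHOUT the cluster_encryption_config block.
# Step 2: add the block and re-apply; the plan then shows
#         "encryption_config { ... } # forces replacement".
module "eks" {
  source  = "terraform-aws-modules/eks/aws"
  version = "14.0.0"

  cluster_name    = "test-eks-mytest-kapok"
  cluster_version = "1.17"
  vpc_id          = "vpc-0c2e790ccdda7a9c9"
  subnets = [
    "subnet-06b9356a56089ac3b",
    "subnet-0934a31b0f092dcf0",
    "subnet-0f43ae8446d5fbe60",
  ]

  # Added only on the second apply; this is what triggers the replacement.
  cluster_encryption_config = [
    {
      provider_key_arn = aws_kms_key.eks.arn # hypothetical KMS key resource
      resources        = ["secrets"]
    }
  ]
}
```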

What's the expected behavior?

The module should update the existing cluster in place rather than forcing creation of a new cluster.

Are you able to fix this problem and submit a PR? Link here if you have already.

Environment details

  • Affected module version: 14.0.0
  • OS:
  • Terraform version: 0.14.6

Any other relevant info


kkapoor1987 commented Feb 18, 2021

@barryib @babilen5 @dpiddockcmp Any thoughts on this ?


mansab commented Feb 19, 2021

We just got hit by the same issue. Our cluster was set up without the encryption config; now that we have added the config to use AWS KMS, the module forces replacement of the cluster, and the apply fails because Terraform cannot create a new cluster with the same name as the old one.

@kkapoor1987
Author

Checking the provider code, it looks like https://github.com/hashicorp/terraform-provider-aws/blob/master/aws/resource_aws_eks_cluster.go#L60 is causing the issue.

Maybe it's for the AWS provider to fix.

@jeff-french

This is a limitation of the EKS platform. You must enable the customer managed key at cluster creation:

...and do remember that, at the current juncture, you can only set this at the cluster creation time (that is, not supported via cluster config updates)

https://aws.amazon.com/blogs/containers/using-eks-encryption-provider-support-for-defense-in-depth/


zerog2k commented Mar 3, 2021

Amazon EKS now supports adding KMS envelope encryption to existing clusters to enhance security for secrets
https://aws.amazon.com/about-aws/whats-new/2021/03/amazon-eks-supports-adding-kms-envelope-encryption-to-existing-clusters/
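Until the Terraform provider supports this in place, the new EKS API can be invoked out-of-band. A hypothetical invocation via the AWS CLI's `aws eks associate-encryption-config` command (cluster name and key ARN below are placeholders):

```shell
# Associate KMS envelope encryption with an existing cluster.
# Note: once enabled, encryption cannot be disassociated from the cluster.
aws eks associate-encryption-config \
  --cluster-name test-eks-mytest-kapok \
  --encryption-config '[{
    "resources": ["secrets"],
    "provider": {"keyArn": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE"}
  }]'
```

Changing the cluster this way will leave the Terraform state out of sync until the provider can reconcile the `encryption_config` block without replacement.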


mansab commented Mar 3, 2021

So should this be re-opened?


knepe commented Mar 5, 2021

We hit this as well now. Modifying the cluster in the AWS Console after creation works fine, but Terraform wants to re-create the cluster.


zerog2k commented Mar 10, 2021

So it looks like this should be addressed by AWS provider changes:
hashicorp/terraform-provider-aws#17952


I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 20, 2022