
Terraform managed worker security group doesn't honour its description #1348

Closed
jack1902 opened this issue May 12, 2021 · 2 comments

Comments

@jack1902

Description

Worker security group:

resource "aws_security_group" "workers" {

The worker security group created by Terraform doesn't honour its description. The description states that the group is for all node groups, but the group is only used when create_launch_template is set to true (its default is false 👀) per node group. This results in a weirdness where the default behaviour is for a node/worker to use the AWS EKS-created cluster security group (this is separate from the cluster security group created by the module).

This issue becomes even more apparent when you have configured a mesh overlay on your cluster where, for example:

  • Node Group 1 has create_launch_template set to false (or unset, defaulting to false in the module)
  • Node Group 2 has create_launch_template set to true.

This leaves the two separate node groups unable to communicate with each other, completely breaking the mesh network 😢.

The gating logic in the module makes this explicit:

for_each = each.value["launch_template_id"] == null && each.value["create_launch_template"] ? [{
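To make the effect concrete, here is a minimal standalone sketch of that pattern (hypothetical names and simplified shapes, not the module's actual internals): with the default create_launch_template = false, the for_each evaluates to an empty collection, no launch template is created, and so the worker security group is never attached to the node group.

locals {
  # One node group's settings, mirroring the module defaults
  launch_template_id     = null
  create_launch_template = false # the module default
}

resource "aws_security_group" "workers" {
  name_prefix = "workers-"
  description = "Security group for all nodes in the cluster" # the description this issue says is not honoured
}

resource "aws_launch_template" "workers" {
  # Same gate as the module's snippet above: nothing is created unless
  # create_launch_template = true, so vpc_security_group_ids below
  # (and with it the worker security group) never reaches the node group.
  for_each = local.launch_template_id == null && local.create_launch_template ? toset(["this"]) : toset([])

  vpc_security_group_ids = [aws_security_group.workers.id]
}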

⚠️ Note

Before you submit an issue, please perform the following first:

  1. Remove the local .terraform directory (ONLY if state is stored remotely, which hopefully is the best practice you are following): rm -rf .terraform/
  2. Re-initialize the project root to pull down modules: terraform init
  3. Re-attempt your terraform plan or apply and check if the issue still persists

Versions

  • Terraform: v0.15.3
  • Provider(s): AWS v3.39.0 (others too, but not relevant to this issue)
  • Module: 15.2.0

Reproduction

Steps to reproduce the behavior:

Code Snippet to Reproduce

Using https://github.com/terraform-aws-modules/terraform-aws-eks/blob/master/examples/managed_node_groups/main.tf, set node_groups to the following:

node_groups = {
    no_worker_sg = {
      desired_capacity = 1
      max_capacity     = 1
      min_capacity     = 1

      instance_types = ["m5.large"]
      capacity_type  = "SPOT"
    },
    worker_sg = {
      desired_capacity = 1
      max_capacity     = 1
      min_capacity     = 1

      instance_types = ["m5.large"]
      capacity_type  = "SPOT"

      # This being the key difference between the two node_groups
      create_launch_template = true
    }
}
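To confirm which group each set of nodes actually receives, the two candidate security group IDs can be surfaced for comparison (a sketch, assuming the example's module block is named eks and that the module exposes these outputs, as the v15 releases do):

output "worker_security_group_id" {
  # The Terraform-managed worker security group, whose description
  # claims it covers all nodes
  value = module.eks.worker_security_group_id
}

output "cluster_primary_security_group_id" {
  # The EKS-created cluster security group that nodes fall back to
  # when create_launch_template is false
  value = module.eks.cluster_primary_security_group_id
}

Comparing these against the security groups on each node's ENI shows the no_worker_sg nodes carrying only the EKS-created group.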

Expected behavior

I would expect the worker security group created by Terraform to ALWAYS be attached to all of my nodes, regardless of what I pass to the module.

Actual behavior

  • no_worker_sg uses the AWS EKS-created cluster security group as the primary security group on its nodes
  • worker_sg uses the Terraform-managed worker security group as its primary security group

This results in zero traffic passing between these two separate node groups 😭
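A workaround implied by the above (a sketch, not an endorsed fix): set create_launch_template = true on every node group, so that all of them receive the Terraform-managed worker security group and can reach each other again:

node_groups = {
    no_worker_sg = {
      # ... settings as in the reproduction above, plus:
      create_launch_template = true
    },
    worker_sg = {
      # ... unchanged
      create_launch_template = true
    }
}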

Terminal Output Screenshot(s)

Additional context

I am running Istio on my cluster, and using a separate node group because I needed a large node dedicated to an application. This would have worked fine if the worker security group had been attached to all node groups as its description states, but it isn't.

I could open a PR against this, but it WILL cause backwards-compatibility issues with the module. Because the EKS security group is attached by default, people will have made assumptions around it, for example allowing that security group to access their database in RDS, when they should be using the worker security group created by Terraform. Due to that default behaviour, if such assumptions have been made, the AWS EKS control plane can then access your RDS instance and so on 😱
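For illustration, here are the two flavours of RDS ingress rule involved (a hedged sketch: the rds security group and port are hypothetical, and the module output names are assumed from the v15 outputs). The first is the assumption people have likely baked in today; the second is what the worker security group's description implies they should be able to rely on.

resource "aws_security_group" "rds" {
  # Hypothetical security group guarding an RDS instance
  name_prefix = "rds-"
}

# What many setups likely do today: trust the EKS-created cluster
# security group, which the control plane also uses.
resource "aws_security_group_rule" "rds_from_cluster_sg" {
  type                     = "ingress"
  from_port                = 5432 # hypothetical Postgres port
  to_port                  = 5432
  protocol                 = "tcp"
  security_group_id        = aws_security_group.rds.id
  source_security_group_id = module.eks.cluster_primary_security_group_id
}

# What the worker security group's description implies: trust only
# the Terraform-managed worker security group.
resource "aws_security_group_rule" "rds_from_worker_sg" {
  type                     = "ingress"
  from_port                = 5432
  to_port                  = 5432
  protocol                 = "tcp"
  security_group_id        = aws_security_group.rds.id
  source_security_group_id = module.eks.worker_security_group_id
}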

@jack1902
Author

I believe this might relate slightly to #665, but I feel what I am describing is a bug entirely on its own.

@github-actions

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 21, 2022