AKS Agent Pool Profile missing maxPods #1729

Closed · dillon-courts opened this issue Aug 6, 2018 · 3 comments · Fixed by #1753

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Terraform Version

Terraform v0.11.7

  • provider.azurerm v1.12.0
  • provider.random v1.3.1

Affected Resource(s)

  • azurerm_kubernetes_cluster

Terraform Configuration Files

resource "azurerm_kubernetes_cluster" "aks_container" {
  name       = "akc-${random_integer.random_int.result}"
  location   = "${var.resource_group_location}"
  dns_prefix = "akc-${random_integer.random_int.result}"

  resource_group_name = "${azurerm_resource_group.akc-rg.name}"

  linux_profile {
    admin_username = "${var.linux_admin_username}"

    ssh_key {
      key_data = "${var.linux_admin_ssh_publickey}"
    }
  }

  agent_pool_profile {
    name    = "agentpool"
    count   = "2"
    vm_size = "Standard_DS2_v2"
    os_type = "Linux"
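    # maxPods is the argument this issue requests; provider v1.12.0 rejects it (see the error below)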
    maxPods = "90"

    # Required for advanced networking
    vnet_subnet_id = "${azurerm_subnet.aks_subnet.id}"
  }

  service_principal {
    client_id       = "${var.client_id}"
    client_secret   = "${var.client_secret}"
  }

  network_profile {
    network_plugin     = "azure"
    dns_service_ip     = "10.0.0.10"
    docker_bridge_cidr = "172.17.0.1/16"
    service_cidr       = "10.0.0.0/16"
  }
}

Expected Behavior

I should be able to declare a value for maxPods in the agent_pool_profile block, based on Microsoft's documentation: https://docs.microsoft.com/en-us/azure/aks/networking-overview#configure-maximum
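
For illustration, here is a sketch of how this could look once the provider supports it. The attribute name max_pods is an assumption on my part, following the provider's snake_case convention for the underlying maxPods API field:

agent_pool_profile {
  name     = "agentpool"
  count    = "2"
  vm_size  = "Standard_DS2_v2"
  os_type  = "Linux"

  # Assumed attribute name; not yet supported by the provider
  max_pods = 90

  vnet_subnet_id = "${azurerm_subnet.aks_subnet.id}"
}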

Actual Behavior

This parameter is not recognized by Terraform. Doc link: https://www.terraform.io/docs/providers/azurerm/r/kubernetes_cluster.html#agent_pool_profile

Debug output:

Error: azurerm_kubernetes_cluster.aks_container: agent_pool_profile.0: invalid or unknown key: maxPods

Steps to Reproduce

N/A

Important Factoids

N/A

References

N/A

@lfshr (Contributor) commented Aug 10, 2018

From the Azure docs: "You cannot modify the maximum number of pods per node when you deploy a cluster with the Azure portal. Advanced networking clusters are limited to 30 pods per node when deployed in the Azure portal."

One to be aware of: does this limit apply to all advanced networking clusters? I think it does; we don't specify the number of NICs to spin up in an advanced cluster, and there are always 31 per agent (30 for pods and 1 for the agent).

@dillon-courts (Author) commented Aug 10, 2018

@lfshr I'm pretty sure the portal limitation does not apply to all advanced networking clusters. The documentation specifies that the maxPods value can be changed using ARM templates or the Azure CLI, and I was able to spin up an AKS cluster from an ARM template with maxPods set to 60, which resulted in 61 NICs per node.
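
For reference, an abridged sketch of the managedClusters resource from such an ARM template, showing where maxPods sits; the names here are placeholders, and the servicePrincipalProfile and networkProfile sections required for a real deployment are omitted:

{
  "type": "Microsoft.ContainerService/managedClusters",
  "apiVersion": "2018-03-31",
  "name": "akc-example",
  "location": "[resourceGroup().location]",
  "properties": {
    "dnsPrefix": "akc-example",
    "agentPoolProfiles": [
      {
        "name": "agentpool",
        "count": 2,
        "vmSize": "Standard_DS2_v2",
        "osType": "Linux",
        "maxPods": 60
      }
    ]
  }
}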

@ghost commented Mar 30, 2020

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. If you feel I made an error 🤖 🙉 , please reach out to my human friends 👉 [email protected]. Thanks!

ghost locked and limited conversation to collaborators Mar 30, 2020