Status Error 400 : AKS creation #1421

Closed

seitosan opened this issue Jun 21, 2018 · 2 comments

@seitosan

Hi,
When I try to deploy an AKS cluster with Terraform, I get this error:

Error: Error applying plan:

1 error(s) occurred:

* azurerm_kubernetes_cluster.k8s: 1 error(s) occurred:

* azurerm_kubernetes_cluster.k8s: containerservice.ManagedClustersClient#CreateOrUpdate: Failure sending request: StatusCode=400 -- Original Error: Code="" Message=""

Terraform does not automatically rollback in the face of errors.
Instead, your Terraform state file has been partially updated with
any resources that successfully completed. Please address the error
above and apply again to incrementally change your infrastructure.

This is my Terraform version:

Terraform v0.11.7

  • provider.azurerm v1.7.0
  • provider.random v1.3.1

My Terraform configuration:

resource "random_string" "rgname" {
  length  = 8
  special = false
  upper   = false
  number  = false
}

resource "azurerm_resource_group" "resgrp" {
  name     = "${random_string.rgname.result}"
  location = "${var.ARM_REGION}"

  tags {
    Env = "test"
  }
}

resource "random_string" "aksname" {
  length  = 8
  special = false
  upper   = false
  number  = false
}

resource "azurerm_kubernetes_cluster" "k8s" {
  name                = "${random_string.aksname.result}"
  location            = "${azurerm_resource_group.resgrp.location}"
  resource_group_name = "${azurerm_resource_group.resgrp.name}"
  dns_prefix          = "acsclient"

  linux_profile {
    admin_username = "admink8s"

    ssh_key {
      key_data = "${var.sshkey}"
    }
  }

  agent_pool_profile {
    name            = "default"
    count           = "1"
    vm_size         = "Standard_A0"
    os_type         = "Linux"
    os_disk_size_gb = 30
  }

  service_principal {
    client_id     = "${var.client_id}"
    client_secret = "${var.client_secret}"
  }

  tags {
    Environment = "Production"
  }
}
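
As an aside for anyone hitting the same empty 400 while the SDK issue is open: one thing worth ruling out (an assumption on my part, not a confirmed diagnosis of this report) is the agent pool VM size, since very small sizes such as Standard_A0 are generally not accepted for AKS agent nodes. A minimal sketch of the same pool with a larger size, purely for comparison:

# Hypothetical variant of the agent_pool_profile above; Standard_DS2_v2 is
# only an example of a size that AKS commonly accepts, not a recommendation.
agent_pool_profile {
  name            = "default"
  count           = "1"
  vm_size         = "Standard_DS2_v2"
  os_type         = "Linux"
  os_disk_size_gb = 30
}

If the request still fails, running terraform apply with TF_LOG=DEBUG set prints the raw API request and response, which may contain more detail than the empty Code/Message shown above.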
@tombuildsstuff (Contributor) commented Jun 21, 2018

hey @seitosan

Thanks for opening this issue :)

We've raised an issue with Microsoft about a bug in the Container Service SDK where errors aren't surfaced properly, which is being tracked in #1416. Since this is a similar case, I'm going to close this issue in favour of #1416.

Thanks!

@ghost commented Mar 30, 2020

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. If you feel I made an error 🤖 🙉 , please reach out to my human friends 👉 [email protected]. Thanks!
