
Creating azurerm_redis_cache_access_policy_assignment failing with context deadline exceeded #25802

Closed

kingnathanal opened this issue Apr 30, 2024 · 3 comments · Fixed by #26085

Comments
kingnathanal commented Apr 30, 2024

Is there an existing issue for this?

  • I have searched the existing issues

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment and review the contribution guide to help.

Terraform Version

1.6.6

AzureRM Provider Version

3.100.0

Affected Resource(s)/Data Source(s)

azurerm_redis_cache_access_policy_assignment

Terraform Configuration Files

resource "azurerm_redis_cache" "this" {
  name                          = "redis_cache_example"
  resource_group_name           = var.resource_group_name
  location                      = var.location
  capacity                      = "1"
  family                        = "P"
  sku_name                      = "Premium"
  enable_non_ssl_port           = false
  minimum_tls_version           = "1.2"
  public_network_access_enabled = false
  zones                         = []
  notify_keyspace_events        = ""
  redis_configuration {
    enable_authentication                   = true
    active_directory_authentication_enabled = true
  }
  redis_version = "6"

  identity {
    type = "SystemAssigned"
  }

  tags = {}
}

resource "azurerm_redis_cache_access_policy_assignment" "msi_access_policy" {
  for_each           = var.app_user_assigned_identity.name != "" ? toset(["enabled"]) : []
  name               = "msi_access_policy_assignment"
  redis_cache_id     = azurerm_redis_cache.this.id
  access_policy_name = "Data Contributor"
  object_id          = var.app_user_assigned_identity.principal_id
  object_id_alias    = var.app_user_assigned_identity.name
}

resource "azurerm_redis_cache_access_policy_assignment" "priv_access_policy" {
  for_each           = var.privileged_ad_group
  name               = "privileged_access_policy_assignment-${each.key}"
  redis_cache_id     = azurerm_redis_cache.this.id
  access_policy_name = "Data Contributor"
  object_id          = each.key
  object_id_alias    = each.key

  depends_on = [azurerm_redis_cache_access_policy_assignment.msi_access_policy]
}

Debug Output/Panic Output

╷
│ Error: failed to create Redis Cache Access Policy Assignment privileged_access_policy_assignment-795be7a1-d35f-4e7e-b7b2-99743c85adc4 in Redis Cache a185279-d01-musea2-redis-dholt1 in resource group a185279-d01-musea2-rg-dholt1: polling after AccessPolicyAssignmentCreateUpdate: context deadline exceeded
│ 
│   with module.primary_redis.azurerm_redis_cache_access_policy_assignment.priv_access_policy["795be7a1-d35f-4e7e-b7b2-99743c85adc4"],
│   on ../../../modules/azurerm_redis/main.tf line 116, in resource "azurerm_redis_cache_access_policy_assignment" "priv_access_policy":
│  116: resource "azurerm_redis_cache_access_policy_assignment" "priv_access_policy" {
│ 
│ failed to create Redis Cache Access Policy Assignment
│ privileged_access_policy_assignment-795be7a1-d35f-4e7e-b7b2-99743c85adc4 in
│ Redis Cache a185279-d01-musea2-redis-dholt1 in resource group
│ a185279-d01-musea2-rg-dholt1: polling after
│ AccessPolicyAssignmentCreateUpdate: context deadline exceeded

Also getting this error when deleting:
╷
│ Error: deleting Redis Cache Access Policy Assignment a218876-d01-musea2-redis-cvca-msi in Redis Cache a218876-d01-musea2-redis-cvca in resource group a218876-d01-musea2-rg-cvca: unexpected status 409 (409 Conflict) with error: Conflict: The resource '/subscriptions/662540e0-68a8-471e-9e0f-f22190c1679c/resourceGroups/a218876-d01-musea2-rg-cvca/providers/Microsoft.Cache/Redis/a218876-d01-musea2-redis-cvca' is busy processing a previous update request or is undergoing system maintenance.  As such, it is currently unable to accept the update request.  Please try again later.
│ RequestID=52afa9fa-8e25-4b8f-8d8c-41004fa2dfea
│ 
╵

Expected Behaviour

Both access policy assignments should have been created successfully.

Actual Behaviour

One of the two access policy assignments was created successfully; the other failed with `polling after AccessPolicyAssignmentCreateUpdate: context deadline exceeded`.

Steps to Reproduce

No response

Important Factoids

No response

References

No response

@kingnathanal kingnathanal changed the title Creating Redis Access Policy failing with context deadline exceeded Creating azurerm_redis_cache_access_policy_assignment failing with context deadline exceeded Apr 30, 2024

the-gabe commented May 7, 2024

I am also experiencing this issue; however, I can partly mitigate it by running `terraform apply` several times, once for each access policy. Bash scripting is what I would suggest if you have a very large list of access policies.
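The loop suggested above could be sketched roughly as follows, under the assumption that applying one access policy assignment at a time avoids the concurrent-update conflict. The resource addresses are placeholders, not taken from this thread; substitute your own.

```shell
# Apply each access policy assignment one at a time so the provider never
# issues two AccessPolicyAssignmentCreateUpdate calls against the same cache.
# The addresses below are illustrative placeholders.
targets=(
  'azurerm_redis_cache_access_policy_assignment.msi_access_policy'
  'azurerm_redis_cache_access_policy_assignment.priv_access_policy'
)
for t in "${targets[@]}"; do
  echo "applying ${t}"
  # terraform apply -auto-approve -target="${t}"  # uncomment for a real run
done
```

Note that `-target` applies are intended as an escape hatch, so this is a stopgap until the provider handles the polling correctly.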

Terraform Version:

Terraform v1.8.2
on linux_amd64
+ provider registry.terraform.io/azure/azapi v1.12.1
+ provider registry.terraform.io/hashicorp/azurerm v3.98.0
+ provider registry.terraform.io/hashicorp/random v3.6.0

Logs:

╷
│ Error: failed to create Redis Cache Access Policy Assignment [email protected] in Redis Cache nameofcachegoeshere in resource group resourcegroupnamegoeshere: polling after AccessPolicyAssignmentCreateUpdate: context deadline exceeded
│
│   with azurerm_redis_cache_access_policy_assignment.nameofcachegoeshere_admins_2,
│   on redis.tf line 25, in resource "azurerm_redis_cache_access_policy_assignment" "nameofcachegoeshere_admins_2":
│   25: resource "azurerm_redis_cache_access_policy_assignment" "nameofcachegoeshere_admins_2" {
│
│ failed to create Redis Cache Access Policy Assignment [email protected] in
│ Redis Cache nameofcachegoeshere in resource group resourcegroupnamegoeshere: polling
│ after AccessPolicyAssignmentCreateUpdate: context deadline exceeded
╵

Code:

resource "azurerm_redis_cache" "nameofcachegoeshere" {
  name                          = "nameofcachegoeshere"
  location                      = azurerm_resource_group.resourcegroupnamegoeshere.location
  resource_group_name           = azurerm_resource_group.resourcegroupnamegoeshere.name
  capacity                      = 1
  family                        = "C"
  sku_name                      = "Basic"
  enable_non_ssl_port           = false
  minimum_tls_version           = "1.2"
  public_network_access_enabled = false
  redis_version                 = "6"
  redis_configuration {
    active_directory_authentication_enabled = true
  }
}
resource "azurerm_redis_cache_access_policy_assignment" "nameofcachegoeshere_admins_1" {
  name               = "[email protected]"
  redis_cache_id     = azurerm_redis_cache.nameofcachegoeshere.id
  access_policy_name = "Data Owner"
  object_id          = "xxxxxxxxxxxxxxxxxxxxxxxx"
  object_id_alias    = "UserMSI"
}
resource "azurerm_redis_cache_access_policy_assignment" "nameofcachegoeshere_admins_2" {
  name               = "[email protected]"
  redis_cache_id     = azurerm_redis_cache.nameofcachegoeshere.id
  access_policy_name = "Data Owner"
  object_id          = "xxxxxxxxxxxxxxxxxxxxxxxx"
  object_id_alias    = "UserMSI"
}
resource "azurerm_redis_cache_access_policy_assignment" "nameofcachegoeshere_admins_3" {
  name               = "[email protected]"
  redis_cache_id     = azurerm_redis_cache.nameofcachegoeshere.id
  access_policy_name = "Data Owner"
  object_id          = "xxxxxxxxxxxxxxxxxxxxxxxx"
  object_id_alias    = "UserMSI"
}

(Sorry about the crappy formatting, Azure DevOps decided to flip me off today)


kingnathanal commented May 9, 2024

@the-gabe I think I have a temporary workaround. I ended up adding a couple of sleeps and a timeout, and my team has not seen any issues since. Fingers crossed.

resource "azurerm_redis_cache" "this" {
  name                          = "redis_cache_example"
  resource_group_name           = var.resource_group_name
  location                      = var.location
  capacity                      = "1"
  family                        = "P"
  sku_name                      = "Premium"
  enable_non_ssl_port           = false
  minimum_tls_version           = "1.2"
  public_network_access_enabled = false
  zones                         = []
  notify_keyspace_events        = ""
  redis_configuration {
    enable_authentication                   = true
    active_directory_authentication_enabled = true
  }
  redis_version = "6"

  identity {
    type = "SystemAssigned"
  }

  tags = {}
}

resource "time_sleep" "wait_120_access_policy" {
  depends_on = [azurerm_redis_cache.this]

  create_duration = "120s"
}

resource "azurerm_redis_cache_access_policy_assignment" "msi_access_policy" {
  for_each           = var.app_user_assigned_identity.name != "" ? toset(["enabled"]) : []
  name               = "msi_access_policy_assignment"
  redis_cache_id     = azurerm_redis_cache.this.id
  access_policy_name = "Data Contributor"
  object_id          = var.app_user_assigned_identity.principal_id
  object_id_alias    = var.app_user_assigned_identity.name
   
  timeouts {
    create = "30m"
  }

  depends_on = [time_sleep.wait_120_access_policy]
}

resource "time_sleep" "wait_120_second_access_policy" {
  depends_on = [azurerm_redis_cache_access_policy_assignment.msi_access_policy]

  create_duration = "120s"
}

resource "azurerm_redis_cache_access_policy_assignment" "priv_access_policy" {
  for_each           = var.privileged_ad_group
  name               = "privileged_access_policy_assignment-${each.key}"
  redis_cache_id     = azurerm_redis_cache.this.id
  access_policy_name = "Data Contributor"
  object_id          = each.key
  object_id_alias    = each.key

  timeouts {
    create = "30m"
  }

  depends_on = [time_sleep.wait_120_second_access_policy]
}
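As an alternative to fixed sleeps, the assignments can be serialized explicitly with `depends_on` so that Terraform never issues two AccessPolicyAssignmentCreateUpdate calls against the same cache at once. A minimal sketch, with illustrative resource names and variables not taken from this thread (note this only helps between resource blocks, not between instances of a single `for_each`):

```hcl
# Sketch: chain each assignment on the previous one so creates run strictly
# one at a time against the cache. Names and variables are illustrative.
resource "azurerm_redis_cache_access_policy_assignment" "first" {
  name               = "assignment_one"
  redis_cache_id     = azurerm_redis_cache.this.id
  access_policy_name = "Data Contributor"
  object_id          = var.first_object_id
  object_id_alias    = "first"
}

resource "azurerm_redis_cache_access_policy_assignment" "second" {
  name               = "assignment_two"
  redis_cache_id     = azurerm_redis_cache.this.id
  access_policy_name = "Data Contributor"
  object_id          = var.second_object_id
  object_id_alias    = "second"

  # Serialize: start this create only after the first one has finished.
  depends_on = [azurerm_redis_cache_access_policy_assignment.first]
}
```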


I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Jun 27, 2024
@rcskosir rcskosir added the bug label Aug 23, 2024