
How to fix accurics.azure.EKM.20 #331

Closed
siwon opened this issue Sep 16, 2020 · 2 comments · Fixed by #489
Labels: bug, policy (Issue concerning policy maintainers)

Comments


siwon commented Sep 16, 2020

  • terrascan version: latest, I presume (I use github/super-linter)
  • Operating System: github/super-linter in a Docker container

Description

I got this error:

  - rule_name: reme_keyVaultAuditLoggingEnabled
    description: Ensure that logging for Azure KeyVault is 'Enabled'
    rule_id: accurics.azure.EKM.20
    severity: HIGH
    category: Encryption and Key Management
    resource_name: main
    resource_type: azurerm_key_vault
    file: main.tf
    line: 145

What I Did

Here is my Terraform config:

resource "azurerm_key_vault" "main" {
  name                        = "kv-${var.prefix}-${terraform.workspace}"
  location                    = azurerm_resource_group.main.location
  resource_group_name         = azurerm_resource_group.main.name
  enabled_for_disk_encryption = false
  tenant_id                   = data.azurerm_client_config.current.tenant_id
  soft_delete_enabled         = true
  purge_protection_enabled    = false

  sku_name = "standard"

  network_acls {
    default_action = "Allow"
    bypass         = "AzureServices"
  }

  tags = merge(azurerm_resource_group.main.tags, {
  })
}

resource "azurerm_monitor_diagnostic_setting" "keyvault" {
  name                       = azurerm_key_vault.main.name
  target_resource_id         = azurerm_key_vault.main.id
  storage_account_id         = azurerm_storage_account.logs.id
  log_analytics_workspace_id = azurerm_log_analytics_workspace.example.id

  log {
    category = "AuditEvent"
    enabled  = true

    retention_policy {
      enabled = true
      days    = 365
    }
  }

  metric {
    category = "AllMetrics"
    enabled  = true

    retention_policy {
      enabled = true
      days    = 365
    }
  }
}

What am I missing?

williepaul (Contributor) commented:
Hello @siwon, thanks for the bug report. I took a first look at the policy, and from what I can tell, it seems to be failing trying to link the azurerm_monitor_diagnostic_setting resource to the corresponding keyvault resource. Hence, it thinks no logging is set. I'll check with the policy folks tonight.

One change that made the match work (just based on what I saw in the policy):
target_resource_id = "azurerm_key_vault.main"

It could be a rendering issue as well; will investigate.

Thanks,
-Willie
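
To illustrate the mismatch Willie describes (this is a sketch of my reading of his comment, not a confirmed analysis of the policy): the valid Terraform form references the Key Vault resource, which resolves to a full Azure resource ID at apply time, while the string the policy apparently matches on is the Terraform resource address itself — which is not a deployable value.

```hcl
# Valid Terraform: the reference resolves to the Key Vault's full Azure
# resource ID at apply time. The policy apparently fails to follow this
# reference back to the azurerm_key_vault resource, so it reports no logging.
resource "azurerm_monitor_diagnostic_setting" "keyvault" {
  name               = azurerm_key_vault.main.name
  target_resource_id = azurerm_key_vault.main.id
  # ... log/metric blocks as in the config above ...
}

# Form that satisfies the policy's match (per Willie's comment) — but this
# literal is the Terraform resource address, not a valid Azure resource ID,
# so using it would break the actual deploy:
#   target_resource_id = "azurerm_key_vault.main"
```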

alex-3sr commented:
Hi,

Yes, I have the same issue here, but as proposed we can't use target_resource_id = "azurerm_key_vault.main" because it will fail the Terraform deploy.

This issue is similar to #432, where the resource group lock is on a separate Terraform resource.

Regards
Alexandre
