
Data Catalog: Running into 403 while updating google_data_catalog_entry.entry_id related to a google_data_catalog_tag #8900

Closed

@gwendal-lecren

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
  • Please do not leave +1 or me too comments, they generate extra noise for issue followers and do not help prioritize the request.
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment.
  • If an issue is assigned to the modular-magician user, it is either in the process of being autogenerated, or is planned to be autogenerated soon. If an issue is assigned to a user, that user is claiming responsibility for the issue. If an issue is assigned to hashibot, a community member has claimed the issue already.

Terraform Version

Terraform v0.14.10
provider registry.terraform.io/hashicorp/google v3.64.0
provider registry.terraform.io/hashicorp/google-beta v3.64.0

Affected Resource(s)

  • google_data_catalog_tag

Terraform Configuration Files

resource "google_data_catalog_entry" "entry_table" {
  for_each              = var.tables
  entry_group           = google_data_catalog_entry_group.entry_group.id
  entry_id              = join("_", [regex("c[123]{1}", each.value.dataset_id), each.value.table_id])
  linked_resource       = "//bigquery.googleapis.com/projects/${local.project}/datasets/${each.value.dataset_id}/tables/${each.value.table_id}"
  user_specified_type   = "BIGQUERY_TABLE"
  user_specified_system = "BIGQUERY_TABLE_SYSTEM"
  description           = each.value.description
  depends_on            = [google_bigquery_table.tables]
}

resource "google_data_catalog_tag" "table_security_tag" {
  count    = length(var.tables)
  parent   = values(google_data_catalog_entry.entry_table)[count.index].id
  template = var.tag_table_security

  fields {
    field_name = "confidentiality"
    enum_value = values(var.tables)[count.index].confidentiality
  }
  fields {
    field_name = "privacy"
    bool_value = values(var.tables)[count.index].privacy
  }
  fields {
    field_name = "gdpr"
    bool_value = values(var.tables)[count.index].gdpr
  }
}

Debug Output

I can't share everything, but here is the relevant part with the 403:
https://gist.github.com/gwendal-lecren/123d7d14762492b97379d3530826a521

Panic Output

Expected Behavior

When renaming the entry_id of the google_data_catalog_entry, the related google_data_catalog_tag should be updated properly. The two resources are linked through the parent field of the google_data_catalog_tag resource.

Actual Behavior

Running into a 403 at the first apply:
Permission denied for projects/dummy-project/datasets/base_layer_303_dummy@europe-west1/entries/c2_dummy_v1, or resource doesn't exist.

After a retry, everything gets updated properly.

Steps to Reproduce

  1. Create a google_data_catalog_entry and a related google_data_catalog_tag through the parent field.
  2. terraform apply
  3. Rename the entry_id of the google_data_catalog_entry (see the sketch after this list)
  4. terraform apply
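
For concreteness, a minimal sketch of the rename in step 3, written as a single-instance version of the entry resource above. The old entry_id is taken from the error message earlier in this report; the new entry_id, dataset, and table names are placeholders, not values from the real configuration:

resource "google_data_catalog_entry" "entry_table" {
  entry_group           = google_data_catalog_entry_group.entry_group.id
  # entry_id            = "c2_dummy_v1"  # value before the rename (from the error above)
  entry_id              = "c2_dummy_v2"  # placeholder value after the rename
  linked_resource       = "//bigquery.googleapis.com/projects/${local.project}/datasets/dummy_dataset/tables/dummy_table"
  user_specified_type   = "BIGQUERY_TABLE"
  user_specified_system = "BIGQUERY_TABLE_SYSTEM"
}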

Important Factoids

It works on the second run. Is the API returning a stale state?

References

  • #0000
@ghost ghost added the bug label Apr 13, 2021
@edwardmedia edwardmedia self-assigned this Apr 13, 2021
@edwardmedia
Contributor

@gwendal-lecren can you try adding sleep logic between these resources?

@gwendal-lecren
Author

Thanks @edwardmedia for getting back to me. Unfortunately, I'm still running into the same issue. Here is the IaC updated with a 120-second sleep between the two resources:

resource "google_data_catalog_entry" "entry_table" {
  for_each              = var.tables
  entry_group           = google_data_catalog_entry_group.entry_group.id
  entry_id              = join("_", [regex("c[123]{1}", each.value.dataset_id), each.value.table_id])
  linked_resource       = "//bigquery.googleapis.com/projects/${local.project}/datasets/${each.value.dataset_id}/tables/${each.value.table_id}"
  user_specified_type   = "BIGQUERY_TABLE"
  user_specified_system = "BIGQUERY_TABLE_SYSTEM"
  description           = each.value.description
  depends_on            = [google_bigquery_table.tables]
}

resource "time_sleep" "entry_table_wait_120_sec" {
  count    = length(var.tables)
  create_duration  = "120s"
  destroy_duration = "120s"
  triggers = {
    entry_table_id = values(google_data_catalog_entry.entry_table)[count.index].id
  }
}

resource "google_data_catalog_tag" "table_security_tag" {
  count    = length(var.tables)
  parent   = time_sleep.entry_table_wait_120_sec[count.index].triggers["entry_table_id"]
  template = var.tag_table_security

  fields {
    field_name = "confidentiality"
    enum_value = values(var.tables)[count.index].confidentiality
  }
  fields {
    field_name = "privacy"
    bool_value = values(var.tables)[count.index].privacy
  }
  fields {
    field_name = "gdpr"
    bool_value = values(var.tables)[count.index].gdpr
  }
}

Trace:
Step #5 - "tf apply": Error: Error updating Tag "projects/project-dummy/locations/europe-west1/entryGroups/base_layer_303_iri/entries/c2_iri_v1/tags/CWr1bZI6RgQG": googleapi: Error 403: Permission denied for projects/project-dummy/datasets/base_layer_303_iri@europe-west1/entries/c2_iri_v1, or resource doesn't exist.

@ghost ghost removed the waiting-response label Apr 14, 2021
@edwardmedia
Contributor

@gwendal-lecren you have a bunch of variables in your config, which makes it hard for me to repro the issue. Can you provide a config with hard-coded values that I can use to reproduce it?

@gwendal-lecren
Author

gwendal-lecren commented Apr 15, 2021

@edwardmedia I tried to narrow down the issue. Here is a configuration with hardcoded values and two different google_data_catalog_entry resources (slightly different from the original configuration):

resource "google_data_catalog_entry_group" "entry_group_issue" {
  project        = local.project
  entry_group_id = "entry_group_issue"
}

resource "google_data_catalog_entry" "entry_issue_1" {
  entry_group           = google_data_catalog_entry_group.entry_group_issue.id
  entry_id              = "dataset_1_table_id_1"
  linked_resource       = "//bigquery.googleapis.com/projects/${local.project}/datasets/dataset_1/tables/table_id_1"
  user_specified_type   = "BIGQUERY_TABLE"
  user_specified_system = "BIGQUERY_TABLE_SYSTEM"
  description           = "table_id_1_description"
}

resource "google_data_catalog_entry" "entry_issue_2" {
  entry_group           = google_data_catalog_entry_group.entry_group_issue.id
  entry_id              = "dataset_1_table_id_2"
  linked_resource       = "//bigquery.googleapis.com/projects/${local.project}/datasets/dataset_1/tables/table_id_2"
  user_specified_type   = "BIGQUERY_TABLE"
  user_specified_system = "BIGQUERY_TABLE_SYSTEM"
  description           = "table_id_2_description"
}

resource "google_data_catalog_tag" "tag_issue" {
  parent   = google_data_catalog_entry.entry_issue_1.id
  template = var.tag_table_security

  fields {
    field_name = "confidentiality"
    enum_value = "C2"
  }
  fields {
    field_name = "privacy"
    bool_value = true
  }
  fields {
    field_name = "gdpr"
    bool_value = false
  }
}

When changing the parent field of the google_data_catalog_tag.tag_issue resource from google_data_catalog_entry.entry_issue_1.id to google_data_catalog_entry.entry_issue_2.id, I run into this issue:

Step #5 - "tf apply": module.main.google_data_catalog_tag.tag_issue: Modifying... [id=projects/dummy-project/locations/europe-west1/entryGroups/entry_group_issue/entries/dataset_1_table_id_1/tags/CWr1bZI6RgQG]
Step #5 - "tf apply":
Step #5 - "tf apply": Error: Provider produced inconsistent result after apply
Step #5 - "tf apply":
Step #5 - "tf apply": When applying changes to module.main.google_data_catalog_tag.tag_issue,
Step #5 - "tf apply": provider "registry.terraform.io/hashicorp/google" produced an unexpected new
Step #5 - "tf apply": value: Root resource was present, but now absent.
Step #5 - "tf apply":
Step #5 - "tf apply": This is a bug in the provider, which should be reported in the provider's own
Step #5 - "tf apply": issue tracker.
Step #5 - "tf apply":

Here is a configuration with hardcoded values and only one google_data_catalog_entry (original configuration):

resource "google_data_catalog_entry_group" "entry_group_issue" {
  project        = local.project
  entry_group_id = "entry_group_issue"
}

resource "google_data_catalog_entry" "entry_issue_1" {
  entry_group           = google_data_catalog_entry_group.entry_group_issue.id
  entry_id              = "dataset_1_table_id_1"
  linked_resource       = "//bigquery.googleapis.com/projects/${local.project}/datasets/dataset_1/tables/table_id_1"
  user_specified_type   = "BIGQUERY_TABLE"
  user_specified_system = "BIGQUERY_TABLE_SYSTEM"
  description           = "table_id_1_description"
}

resource "google_data_catalog_tag" "tag_issue" {
  parent   = google_data_catalog_entry.entry_issue_1.id
  template = var.tag_table_security

  fields {
    field_name = "confidentiality"
    enum_value = "C2"
  }
  fields {
    field_name = "privacy"
    bool_value = true
  }
  fields {
    field_name = "gdpr"
    bool_value = false
  }
}

When changing the entry_id field of the google_data_catalog_entry.entry_issue_1 resource from dataset_1_table_id_1 to dataset_1_table_id_2, I run into the 403 issue:

Step #5 - "tf apply": Error: Error updating Tag "projects/dummy-project/locations/europe-west1/entryGroups/entry_group_issue/entries/dataset_1_table_id_1/tags/CWr1bZI6RgQG": googleapi: Error 403: Permission denied for projects/dummy-project/datasets/entry_group_issue@europe-west1/entries/dataset_1_table_id_1, or resource doesn't exist.
Step #5 - "tf apply":
Step #5 - "tf apply":   on module/main/tags_test.tf line 15, in resource "google_data_catalog_tag" "tag_issue":
Step #5 - "tf apply":   15: resource "google_data_catalog_tag" "tag_issue" {

A retry on the second configuration makes it work; a retry on the first configuration does not.
Adding sleep logic to the second configuration still results in the 403.

@ghost

ghost commented May 21, 2021

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. If you feel I made an error 🤖 🙉 , please reach out to my human friends 👉 [email protected]. Thanks!

@ghost ghost locked as resolved and limited conversation to collaborators May 21, 2021