
Label extractor in google_logging_metric does not parse json message #9890

Closed
dcallao opened this issue Aug 23, 2021 · 11 comments


dcallao commented Aug 23, 2021

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
  • Please do not leave +1 or me too comments, they generate extra noise for issue followers and do not help prioritize the request.
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment.
  • If an issue is assigned to the modular-magician user, it is either in the process of being autogenerated, or is planned to be autogenerated soon. If an issue is assigned to a user, that user is claiming responsibility for the issue. If an issue is assigned to hashibot, a community member has claimed the issue already.

Terraform Version

Terraform v0.13.0

Affected Resource(s)

google_logging_metric and google_monitoring_alert_policy

Terraform Configuration Files

resource "google_logging_metric" "newmember_logging_metric" {
  name   = "new-group-member-metric"
  filter = "jsonPayload.alert_type=\"user_added_to_group\""
  metric_descriptor {
    metric_kind  = "DELTA"
    value_type   = "INT64"
    unit         = "1"
    display_name = "Set add member to group event"
    labels {
      key         = "message"
      value_type  = "INT64"
      description = "log message"
    }
  }
  label_extractors = {
    "message" = "EXTRACT(jsonPayload.message)"
  }
}

Expected Behavior

The label extractor is referenced within the google_monitoring_alert_policy resource:

# Add alert for metric
resource "google_monitoring_alert_policy" "alert_policy_new_group-member" {
  display_name          = "Alert - New Group Member Added"
  combiner              = "OR"
  notification_channels = flatten([google_monitoring_notification_channel.slack[*].name, google_monitoring_notification_channel.email[*].name])
  conditions {
    display_name = "Alert - Foreign Member Added in Workspace Group"
    condition_threshold {
      filter          = "metric.type=\"logging.googleapis.com/user/${google_logging_metric.newmember_logging_metric.name}\" resource.type=\"cloud_function\""
      duration        = "60s"
      comparison      = "COMPARISON_GT"
      threshold_value = 0
      trigger {
        count   = 1
        percent = 0
      }
      aggregations {
        alignment_period   = "60s"
        per_series_aligner = "ALIGN_COUNT"
      }
    }
  }
  documentation {
    mime_type = "text/markdown"
    content   = "$${metric.label.message}"
  }
}

That should have printed the entire JSON string from the jsonPayload message under the "Policy Document" section of the alert message:

"{"insertId":"####","logName":"organizations/#####/logs/cloudaudit.googleapis.com%2Factivity","protoPayload":{"@type":"type.googleapis.com/google.cloud.audit.AuditLog","authenticationInfo":{"principalEmail":"[email protected]"},"authorizationInfo":[{"granted":true,"permission":"cloudidentity.membership.update","resource":"cloudidentity.googleapis.com/groups/#####"}],"metadata":{"@type":"type.googleapis.com/google.cloud.audit.GroupAuditMetadata","group":"group:[email protected]","membershipDelta":{"member":"user:[email protected]","roleDeltas":[{"action":"ADD","role":"MEMBER"}]}},"methodName":"google.apps.cloudidentity.groups.v1.MembershipsService.UpdateMembership","requestMetadata":{"callerIp":"192.184.141.227","callerSuppliedUserAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.107 Safari/537.36,gzip(gfe),gzip(gfe)"},"resourceName":"groups/[email protected]","serviceName":"cloudidentity.googleapis.com"},"receiveTimestamp":"2021-08-05T22:36:34.716026782Z","resource":{"labels":{"method":"google.apps.cloudidentity.groups.v1.MembershipsService.UpdateMembership","service":"cloudidentity.googleapis.com"},"type":"audited_resource"},"severity":"NOTICE","timestamp":"2021-08-05T22:36:34.161045Z"}"

Actual Behavior

The Alert message's "Policy Document" section shows "(null)":

Alert - Foreign Member Added in Workspace Group
Set add member to group event for testing-123456 Cloud Function labels {project_id=testing-123456, function_name=test-logs-function, region=us-east1} with metric labels {log=cloudfunctions.googleapis.com/cloud-functions} is above the threshold of 0.000 with a value of 1.000.

Summary
Start time
Aug 16, 2021 at 9:54PM UTC (less than 1 sec ago)

Project
testing-123456

Policy
Alert - New Group Member Added

Condition
Alert - Foreign Member Added in Workspace Group

Metric
logging.googleapis.com/user/new-group-member-metric

Threshold
above 0

Observed
1.000

Policy documentation
(null)

b/275110452

dcallao added the bug label Aug 23, 2021
megan07 self-assigned this Aug 24, 2021

megan07 commented Aug 27, 2021

Hi @dcallao! I'm sorry you're running into this. Would you be able to show me how your google_monitoring_alert_policy is configured as well? I see in the documentation that these can sometimes be null if no values are returned from the time series query. I'm still working on reproducing this, but wanted to double-check with you on this first. Thanks!


dcallao commented Aug 27, 2021

Hi @megan07! I have posted the full block of code of the google_monitoring_alert_policy resource in the issue description. Thanks!


megan07 commented Sep 10, 2021

Hi @dcallao! Sorry for the delay, I was out of office for a bit. I picked this back up today and, although I am not using your exact metrics, I am also seeing the null value for my Policy Documentation. However, I'm creating it in the console. Have you been able to recreate this in the console and see an actual message there?
Thanks!


dcallao commented Sep 14, 2021

I recreated it in the console but don't see the message there either.


megan07 commented Sep 14, 2021

Is it possible your labels.value_type should be STRING?
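
For reference, a minimal sketch of that change, keeping the rest of the metric exactly as posted in the issue description. Since jsonPayload.message holds a string, the label's value_type would need to match (whether this alone resolves the null is discussed below):

labels {
  key         = "message"
  value_type  = "STRING"
  description = "log message"
}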


dcallao commented Sep 20, 2021

Hi @megan07! I did try to change it to STRING but I got the null error again.


dcallao commented Oct 4, 2021

@megan07, any luck with this issue?


dcallao commented Oct 14, 2021

@megan07, checking back with you again. Any luck with this?


megan07 commented Oct 15, 2021

Hi @dcallao, I'm on bug duty this week and just reviewed your comments. From my understanding: 1) if you're seeing the null even when setting it up in the console, I'd suggest getting it working in the console first and then mapping that back to Terraform; 2) if I'm misunderstanding, and it is working in the console but not in Terraform, I'd be interested in seeing what values you used in the console.

It seems to me that google_logging_metric.labels.value_type = "INT64" wouldn't be correct for jsonPayload.message, since that value is a string and not an int; that's the fix that helped me. To research this further, I need details showing that the issue is Terraform-specific.

Thanks!

melinath (Collaborator) commented

It looks like this is the expected behavior for the API. From https://cloud.google.com/monitoring/alerts/doc-variables#null-values:

The resource.label.KEY and metric.label.KEY variables can have null values if your alerting policy uses cross-series aggregation (reduction), for example, calculating the SUM across each of the time-series that match a filter. When using cross-series aggregation, any labels not used in grouping are dropped and as a result they render as null when the variable is replaced with its value. All labels are retained when there is no cross-series aggregation.
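In Terraform terms, retaining a label through cross-series aggregation means listing it in the grouping fields so the reducer doesn't drop it. A hedged sketch of the aggregations block under that assumption: cross_series_reducer and group_by_fields are standard fields of this block, while REDUCE_SUM and the metric.label.message path are illustrative choices for this particular metric, untested against this exact policy:

aggregations {
  alignment_period     = "60s"
  per_series_aligner   = "ALIGN_COUNT"
  cross_series_reducer = "REDUCE_SUM"
  group_by_fields      = ["metric.label.message"] # labels listed here survive the reduction
}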

As @megan07 said above, one possible way to debug the resource when Terraform doesn't produce the expected result is to create the resource with a different tool, confirm the results are as expected, and then map the resource back into Terraform.

github-actions (bot) commented

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

github-actions bot locked as resolved and limited conversation to collaborators May 15, 2023