
Loading incoming event into dictionary if from Cloudwatch. (#24) #25

Closed
wants to merge 1 commit

Conversation


@ryron01 (Contributor) commented Jan 12, 2019

No description provided.

@antonbabenko (Member)

Thanks for the PR and investigation!

I believe you tried it, so could you please update an example where the same Lambda function is used for both CloudWatch events and other event types (e.g., SNS)?

Also, have you found any confirmation that AWS submits messages in different formats? It would be great to add that to the README.

@ryron01 (Contributor, Author) commented Jan 13, 2019

Yes, I did test it. I used a simple event, then tested again using an example event I found (see below). Effectively, I reverted the behavior back to 1.10.0 for events coming from CloudWatch, using the existing logic.

I don't have any CloudWatch Alarms configured, but I did find two examples that I used. The format is basically the same, but the message content is JSON escaped into a string.

{
  "Records": [
    {
      "EventSource": "aws:sns",
      "EventVersion": "1.0",
      "EventSubscriptionArn": "arn:aws:sns:eu-west-1:000000000000:cloudwatch-alarms:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
      "Sns": {
        "Type": "Notification",
        "MessageId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
        "TopicArn": "arn:aws:sns:eu-west-1:000000000000:cloudwatch-alarms",
        "Subject": "ALARM: \"Example alarm name\" in EU - Ireland",
        "Message": "{\"AlarmName\":\"Example alarm name\",\"AlarmDescription\":\"Example alarm description.\",\"AWSAccountId\":\"000000000000\",\"NewStateValue\":\"ALARM\",\"NewStateReason\":\"Threshold Crossed: 1 datapoint (10.0) was greater than or equal to the threshold (1.0).\",\"StateChangeTime\":\"2017-01-12T16:30:42.236+0000\",\"Region\":\"EU - Ireland\",\"OldStateValue\":\"OK\",\"Trigger\":{\"MetricName\":\"DeliveryErrors\",\"Namespace\":\"ExampleNamespace\",\"Statistic\":\"SUM\",\"Unit\":null,\"Dimensions\":[],\"Period\":300,\"EvaluationPeriods\":1,\"ComparisonOperator\":\"GreaterThanOrEqualToThreshold\",\"Threshold\":1.0}}",
        "Timestamp": "2017-01-12T16:30:42.318Z",
        "SignatureVersion": "1",
        "Signature": "Cg==",
        "SigningCertUrl": "https://sns.eu-west-1.amazonaws.com/SimpleNotificationService-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.pem",
        "UnsubscribeUrl": "https://sns.eu-west-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:eu-west-1:000000000000:cloudwatch-alarms:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
        "MessageAttributes": {}
      }
    }
  ]
}

Another example here: https://cloudonaut.io/send-cloudwatch-alarms-to-slack-with-aws-lambda/
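To illustrate the double encoding described above, here is a minimal sketch (not from the PR itself; the trimmed event below is a hypothetical sample modeled on the example) showing that the inner alarm payload needs a second parse:

```python
import json

# A trimmed SNS event: the CloudWatch alarm arrives as an escaped
# JSON string inside the "Message" field (hypothetical sample).
event = {
    "Records": [
        {
            "EventSource": "aws:sns",
            "Sns": {
                "Subject": 'ALARM: "Example alarm name" in EU - Ireland',
                "Message": '{"AlarmName": "Example alarm name", "NewStateValue": "ALARM"}',
            },
        }
    ]
}

# The outer event is already a dict, but the inner Message is still a
# string and needs its own json.loads before its fields can be read.
message = event["Records"][0]["Sns"]["Message"]
alarm = json.loads(message)
print(alarm["AlarmName"])       # "Example alarm name"
print(alarm["NewStateValue"])   # "ALARM"
```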

@yujunz commented Jan 21, 2019

+1

It works for me :-)

@alexgottscha (Contributor) left a comment


I'm using notify-slack specifically for passing CloudWatch alarms via SNS into Slack, and ran into the error fixed here. That said, instead of relying on the presence of "AlarmName" in the message, it would make sense to check the type of the message and attempt the JSON load if it's a string rather than a dict. Something like this:

import json
import logging
import sys

# Parse the message only when it arrives as a string; a dict has
# already been deserialized and can be used as-is.
if isinstance(message, str):
    try:
        message = json.loads(message)
    except json.JSONDecodeError as err:
        logging.error(f'message is type string but cannot load json: {err}')
        sys.exit(1)
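For what it's worth, a type check like the one suggested above handles both delivery shapes without inspecting field names. A small standalone sketch (the `normalize` helper and its names are illustrative, not part of the actual handler):

```python
import json


def normalize(message):
    """Return the message as a dict, parsing it first if it is a string.

    Accepts either an already-deserialized dict or a JSON string,
    such as the escaped CloudWatch alarm inside an SNS Message.
    """
    if isinstance(message, str):
        try:
            message = json.loads(message)
        except json.JSONDecodeError as err:
            raise ValueError(f"message is type string but cannot load json: {err}")
    return message


print(normalize({"AlarmName": "x"}))    # dict passes through unchanged
print(normalize('{"AlarmName": "x"}'))  # string is parsed into a dict
```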

@antonbabenko (Member)

v1.13.0 has been released with a similar fix.

@github-actions bot commented Nov 9, 2022

I'm going to lock this pull request because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems related to this change, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 9, 2022