Loading incoming event into dictionary if from Cloudwatch. (#24) #25
Conversation
Thanks for the PR and investigation! Since you've tried it, could you please update an example where the same Lambda function is used for both CloudWatch events and other event types (e.g., SNS)? Also, have you found any confirmation that AWS submits messages in a different format? It would be great to add that to the README.
Yes, I did test it. I used a simple event, then tested again using an example event I found (see below). Effectively, I reverted the behavior back to 1.10.0 when the event comes from CloudWatch, using the existing logic. I don't have any CloudWatch Alarms configured, but I did find two examples that I used. The format is basically the same, but the message content is an escaped JSON string.
Another example here: https://cloudonaut.io/send-cloudwatch-alarms-to-slack-with-aws-lambda/
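To illustrate the format being discussed: when a CloudWatch alarm is delivered through SNS, the alarm payload arrives in `Records[0].Sns.Message` as an escaped JSON string rather than a dict, so it must be parsed before fields like `AlarmName` can be read. A minimal sketch (the event below is a hypothetical, trimmed-down example, not a full AWS payload):

```python
import json

# Hypothetical minimal SNS event: the CloudWatch alarm payload is
# delivered as an escaped JSON *string*, not a dict.
event = {
    "Records": [
        {
            "Sns": {
                "Message": "{\"AlarmName\": \"ExampleAlarm\", \"NewStateValue\": \"ALARM\"}"
            }
        }
    ]
}

message = event["Records"][0]["Sns"]["Message"]
if isinstance(message, str):
    # Unwrap the escaped JSON string into a dict.
    message = json.loads(message)

print(message["AlarmName"])  # -> ExampleAlarm
```

This is why a plain dict lookup on the raw `Message` fails for CloudWatch-originated events: the `json.loads` step is required first.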
+1 It works for me :-)
I'm using notify-slack specifically for passing CloudWatch alarms via SNS into Slack, and ran into the error fixed here. That said, instead of relying on the presence of "AlarmName" in message, it would make sense to check the type of message and attempt the JSON load if it's a string rather than a dict. Something like this:
import json
import logging
import sys

if isinstance(message, str):
    try:
        message = json.loads(message)
    except json.JSONDecodeError as err:
        logging.error(f'message is type string but cannot load json: {err}')
        sys.exit(1)
v1.13.0 has been released with a similar fix.
I'm going to lock this pull request because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems related to this change, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.