
string indices must be integers: TypeError #24

Closed
PatrickMans opened this issue Jan 7, 2019 · 14 comments

Comments

@PatrickMans

Just rolled out the latest pull, which still contains an error:

string indices must be integers: TypeError
Traceback (most recent call last):
  File "/var/task/automanager_notification.py", line 78, in lambda_handler
    notify_slack(message, region)
  File "/var/task/automanager_notification.py", line 63, in notify_slack
    notification = cloudwatch_notification(message, region)
  File "/var/task/automanager_notification.py", line 22, in cloudwatch_notification
    "color": states[message['NewStateValue']],
TypeError: string indices must be integers

I am worthless when it comes to Python, so I don't have a clue how to solve this one. When publishing plain text to this topic, it does work.

@yujunz

yujunz commented Jan 9, 2019

Met a similar error for a CloudWatch Alarm:

string indices must be integers: TypeError
Traceback (most recent call last):
  File "/var/task/notify_slack.py", line 78, in lambda_handler
    notify_slack(message, region)
  File "/var/task/notify_slack.py", line 63, in notify_slack
    notification = cloudwatch_notification(message, region)
  File "/var/task/notify_slack.py", line 22, in cloudwatch_notification
    "color": states[message['NewStateValue']],
TypeError: string indices must be integers

@fernandrone

Same issue!

Reverting to the previous version seems to have fixed it:

source  = "terraform-aws-modules/notify-slack/aws"
version = "1.10.0"

This seems to be the problem: #23

It's not parsing the message from JSON to a Python dictionary anymore, but it is accessing the object as if it were a dictionary here: https://github.com/terraform-aws-modules/terraform-aws-notify-slack/blob/master/functions/notify_slack.py#L22

Not sure why the change was made, though. There's nothing in the PR nor in the release notes.
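
To illustrate the failure described above, here is a minimal sketch; the states mapping and the payload are made up and only mirror the shape of the module's cloudwatch_notification():

import json

# Assumed severity mapping, similar in spirit to the module's; values are illustrative.
states = {"OK": "good", "INSUFFICIENT_DATA": "warning", "ALARM": "danger"}

# SNS delivers the CloudWatch alarm payload as a JSON-encoded string.
raw_message = '{"AlarmName": "example-alarm", "NewStateValue": "ALARM"}'

# Indexing the raw string with a string key reproduces the reported error:
#   TypeError: string indices must be integers
# color = states[raw_message["NewStateValue"]]

# Decoding it first gives a dict, and the lookup works as intended.
message = json.loads(raw_message)
color = states[message["NewStateValue"]]
print(color)  # danger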

@fernandrone

@antonbabenko @ryron01 maybe you can shed some light on this?

I was going to fork this and open a PR reverting #23, but that's the only commit in the latest release. If this really is the issue, then deleting release 1.11 instead might be the best option.

@antonbabenko
Member

If I understand correctly (and I ran it myself to test it), the fix introduced in #23 was necessary.

Additionally, we need to fix the TypeError: string indices must be integers error described in this issue.

Anyone?

@fernandrone

I think the error is what I pointed out in my previous comment:

It's not parsing the message from JSON to a Python dictionary anymore, but it is accessing the object as if it were a dictionary here: https://github.com/terraform-aws-modules/terraform-aws-notify-slack/blob/master/functions/notify_slack.py#L22

So at line 22, message is of type "string", not "dict". Thus accessing it as message['NewStateValue'] is invalid.

If I understand correctly (and I ran it myself to test it), the fix introduced in #23 was necessary.

Just out of curiosity, what was that fixing?

I reverted to 1.10.0 on my environments and it's working fine, so I guess the fix might have been targeted at a different use case than mine.

@nao-kagayaki

Hi, just sharing this.
I'm using version 1.10.0, and I just captured the error in my environment that I guess #23 tried to fix.

[screenshot of the error, captured 2019-01-12]

Not every Lambda execution fails, though, only under certain conditions, if my understanding is correct.
I haven't debugged it yet, and I'm wondering what went wrong, because I looked into another repo that uses the same processing:
https://github.com/awslabs/serverless-application-model/blob/master/examples/apps/cloudwatch-alarm-to-slack-python3/lambda_function.py

@ryron01
Contributor

ryron01 commented Jan 12, 2019

@nyamada2 is correct; that is the error #23 was fixing. In the incoming event, event['Records'][0]['Sns']['Message'] is not valid JSON in my case. See the example below.

If someone can provide an example AWS event that includes message['NewStateValue'], I'd be willing to take a look at it.

{ "Records": [ { "EventSource": "aws:sns", "EventVersion": "1.0", "EventSubscriptionArn": "arn:aws:sns:us-west-2:{{accountId}}:ExampleTopic", "Sns": { "Type": "Notification", "MessageId": "95df01b4-ee98-5cb9-9903-4c221d41eb5e", "TopicArn": "arn:aws:sns:us-west-2:123456789012:ExampleTopic", "Subject": "example subject", "Message": "example message", "Timestamp": "1970-01-01T00:00:00.000Z", "SignatureVersion": "1", "Signature": "EXAMPLE", "SigningCertUrl": "EXAMPLE", "UnsubscribeUrl": "EXAMPLE", "MessageAttributes": { "Test": { "Type": "String", "Value": "TestString" }, "TestBinary": { "Type": "Binary", "Value": "TestBinary" } } } } ] }

ryron01 pushed a commit to ryron01/terraform-aws-notify-slack that referenced this issue Jan 12, 2019
@akurtasinski

akurtasinski commented Jan 15, 2019

Full event from CloudWatch:

{
  'Records': [
    {
      'EventSource': 'aws:sns',
      'EventVersion': '1.0',
      'EventSubscriptionArn': 'EventSubscriptionArn',
      'Sns': {
        'Type': 'Notification',
        'MessageId': 'MessageId',
        'TopicArn': 'TopicArn',
        'Subject': 'ALARM: "ecs-service-CPUUtilization" in EU (Ireland)',
        'Message': '{"AlarmName":"ecs-service-CPUUtilization","AlarmDescription":"This alarm triggers when service CPU utilization is high. Created by Terraform.","AWSAccountId":"AWSAccountId","NewStateValue":"ALARM","NewStateReason":"Threshold Crossed: 1 datapoint [1.9185895938970887 (15/01/19 10:48:00)] was greater than or equal to the threshold (1.0).","StateChangeTime":"2019-01-15T10:49:59.967+0000","Region":"EU (Ireland)","OldStateValue":"OK","Trigger":{"MetricName":"CPUUtilization","Namespace":"AWS/ECS","StatisticType":"Statistic","Statistic":"AVERAGE","Unit":null,"Dimensions":[{"value":"ecs-service","name":"ServiceName"},{"value":"ecs-cluster","name":"ClusterName"}],"Period":60,"EvaluationPeriods":1,"ComparisonOperator":"GreaterThanOrEqualToThreshold","Threshold":1.0,"TreatMissingData":"- TreatMissingData: Ignore","EvaluateLowSampleCountPercentile":""}}',
        'Timestamp': '2019-01-15T10:49:59.980Z',
        'SignatureVersion': '1',
        'Signature': 'SIGNATURE',
        'SigningCertUrl': 'SigningCertUrl',
        'UnsubscribeUrl': 'UnsubscribeUrl',
        'MessageAttributes': {}
      }
    }
  ]
}

So in my case event['Records'][0]['Sns']['Message'] is valid JSON, and reverting #23 fixes the problem.
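
For completeness: with an event shaped like the one above, the envelope is already a dict, but the inner Message is still a JSON string and needs its own json.loads() before lookups like message['NewStateValue'] work. Trimmed sketch with made-up values:

import json

# Trimmed-down version of the event above; only the fields that matter here.
event = {
    "Records": [
        {
            "EventSource": "aws:sns",
            "Sns": {
                "Subject": 'ALARM: "ecs-service-CPUUtilization" in EU (Ireland)',
                "Message": '{"AlarmName": "ecs-service-CPUUtilization", '
                           '"NewStateValue": "ALARM", "OldStateValue": "OK"}',
            },
        }
    ]
}

# The envelope is a dict, but Message is a JSON-encoded string.
raw_message = event["Records"][0]["Sns"]["Message"]
message = json.loads(raw_message)
print(message["NewStateValue"])  # ALARM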

@fernandrone

fernandrone commented Jan 16, 2019

Hi @ryron01

I'm not sure I got it right, but your example is a generic example message. This module's documentation says it supports CloudWatch Events only, and it seems that every CW event payload is JSON (if someone has an example where it isn't, that'd be helpful).

And if it's not JSON, then the rest of the module will fail anyway, won't it?

Anyway, wouldn't the best solution be to simply try to load the message as JSON, catch any JSONDecodeError, and discard it if it fails?

EDIT: Oh, I see you've already developed a solution along these lines in #25 👍
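
For reference, a sketch of that idea (the helper name is made up; #25 is the actual fix and may differ):

import json

def decode_sns_message(raw_message):
    # Hypothetical helper: return a dict if the SNS message is JSON,
    # otherwise keep the raw text instead of crashing the handler.
    try:
        return json.loads(raw_message)
    except json.JSONDecodeError:
        return raw_message

print(decode_sns_message('{"NewStateValue": "ALARM"}'))  # {'NewStateValue': 'ALARM'}
print(decode_sns_message("example message"))             # example message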

@PatrickMans
Author

I would love to post the complete message here as sent by SNS to Lambda... but how do I get the message from the log?

@grealish

This issue is affecting us too, on both 1.10.0 and 1.11.0:
the same error as above when setting up Network ELB and VPN state alerts.

@yujunz

yujunz commented Jan 21, 2019

@ryron01's patch works for me.

@antonbabenko
Member

v1.13.0 has been released with a similar fix.

@github-actions

github-actions bot commented Nov 9, 2022

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.
