
golang deadlock with s3object #356

Closed
vinhltr opened this issue Oct 1, 2024 · 6 comments
Labels
bug Something isn't working

Comments

vinhltr commented Oct 1, 2024

I'm having an issue regarding S3Object that started just this past week. Initially the nuke process simply hung with no error, even with trace logging enabled. After some more digging, it turned out to be specific to us-east-1 and to one particular log bucket.

In my sandbox env there is an access log bucket with roughly 67k objects (total bucket size under 50 MB) in that single bucket; I'd estimate the grand total across all existing buckets at roughly 70k objects. This causes a golang deadlock error when iterating over the S3Object resources. I then emptied the access log bucket, retried the aws-nuke CLI, and it worked.

[Screenshot attached: 2024-10-01 at 8:50:44 AM]

Some more context: my config does not filter S3Object at all. This is also the first time I've seen this issue, and I've been using aws-nuke (the old repo and this fork) for almost a year now.

version: 3.24.0

@vinhltr vinhltr changed the title Error gotang deadlock when iterating over large amount of S3Objects Error golang deadlock when iterating over large amount of S3Objects Oct 1, 2024

ekristen (Owner) commented Oct 1, 2024

Thanks @vinhltr, I'll have to look into this more. Do you actually use S3Object on purpose? 99% of the time folks really just want S3Bucket. I'm inclined to disable S3Object by default, as it's very problematic.

@ekristen ekristen added the bug Something isn't working label Oct 1, 2024

vinhltr (Author) commented Oct 1, 2024

@ekristen My understanding is that to nuke an S3 bucket I have to wipe it first, which means deleting all the S3 objects, so yes, I'm using S3Object on purpose.


ekristen (Owner) commented Oct 1, 2024

You aren't wrong, but S3Bucket wipes an entire bucket clean much more efficiently using bulk API calls. S3Object should really just always be disabled.

Exclude S3Object in your configuration. You'll end up with the same result, and it will be much quicker.
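For reference, a minimal sketch of that exclusion in an aws-nuke config (the `resource-types` block shown here follows the tool's config format; merge it into your existing file rather than replacing it):

```yaml
resource-types:
  excludes:
    - S3Object   # let S3Bucket empty buckets via bulk deletes instead
```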


vinhltr (Author) commented Oct 1, 2024

Noted, I'll make the adjustment in my code. Thanks @ekristen

@ekristen ekristen changed the title Error golang deadlock when iterating over large amount of S3Objects golang deadlock with s3object Oct 1, 2024
@ikarlashov

I actually hit the same thing. aws-nuke didn't nuke any resources. Then I excluded S3Object and resources were deleted. Weird :-/


ekristen (Owner) commented Oct 14, 2024

Closing. S3Object will be disabled by default in the 4.x release of the tool. See #380
