
sync fails with a large number of files to remove #435

Closed
kbarrette opened this issue Apr 11, 2022 · 2 comments · Fixed by #438
@kbarrette

sync exits with status 1 and no error message in situations that result in a large number of deletions:

Run from an empty local directory, which will cause around 500k deletions:

❯ s5cmd --log=debug --stat sync --delete . s3://my-bucket/lots-of-keys/

Operation	Total	Error	Success
sync		1	1	0

❯ echo $status
1

The same behavior also exists when running with the --dry-run flag.

Running the same command against an S3 path that results in 1 deletion works normally.

I realize this isn't a lot of information to work with - please let me know if I can provide more help somehow.

Thanks!

@kbarrette
Author

Note to anyone who finds this: I'm currently working around this by using two operations:

  1. s5cmd cp to copy files, which takes advantage of s5cmd's speed
  2. aws s3 sync with the official AWS CLI, which does the necessary deletions

As far as I can tell, this is faster than just aws s3 sync alone, but obviously I'd rather be using just one tool.
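The two-step workaround above can be sketched as follows. The bucket path is taken from the report; the exact `s5cmd cp` arguments depend on your directory layout, and the `--delete` flag on `aws s3 sync` is what performs the remote-only deletions:

```shell
# Step 1: copy local files up to S3 with s5cmd, which parallelizes heavily
s5cmd cp "./*" s3://my-bucket/lots-of-keys/

# Step 2: let the official AWS CLI remove keys that no longer exist locally
aws s3 sync --delete . s3://my-bucket/lots-of-keys/
```

Note that step 2 re-compares the whole tree, so most of the speed win comes from s5cmd doing the actual uploads in step 1.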

@sonmezonur
Member

Could you please try this setup with the new release (v2.0.0-beta.2)? The issue should be resolved now. Feel free to reopen the issue if you are still seeing this behavior.
