Hello,
I have FakeS3 set up on Ubuntu 14.04 LTS 64-bit.
I have an S3 application developed using the AWS S3 C++ API. With this application I can successfully upload files to S3, but I'm now getting to the point where I'd like to test with some REALLY big files. For cost (and time) reasons I'd like to use FakeS3.
I am starting FakeS3 on the Ubuntu box with these arguments:
fakes3 -r /home/dev/fakes3_root/ -p 4567 -H fakes3.local
Against real S3 I can successfully upload both a 16MB and a 500MB file; each upload uses the S3 multipart API and succeeds. When I switch to FakeS3, the 16MB file uploads successfully, but FakeS3 appears to fail with the 500MB file. For info: the parts are uploaded in 5MB blocks.
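For context, the upload logic is roughly the following. This is a minimal sketch of a 5MB-part multipart upload with the AWS SDK for C++, not my actual application code; the bucket name, key, and file path are illustrative, and the endpoint override just points the client at the FakeS3 instance started above. (5MB is also S3's minimum size for every part except the last.)

```cpp
#include <aws/core/Aws.h>
#include <aws/core/utils/memory/stl/AWSStringStream.h>
#include <aws/s3/S3Client.h>
#include <aws/s3/model/CreateMultipartUploadRequest.h>
#include <aws/s3/model/UploadPartRequest.h>
#include <aws/s3/model/CompleteMultipartUploadRequest.h>
#include <aws/s3/model/CompletedMultipartUpload.h>
#include <aws/s3/model/CompletedPart.h>
#include <fstream>
#include <iostream>
#include <vector>

int main() {
    Aws::SDKOptions options;
    Aws::InitAPI(options);
    {
        // Point the client at the local FakeS3 endpoint instead of real S3.
        // Host and port match the fakes3 invocation above.
        Aws::Client::ClientConfiguration config;
        config.endpointOverride = "fakes3.local:4567";
        config.scheme = Aws::Http::Scheme::HTTP;
        Aws::S3::S3Client client(config);

        const Aws::String bucket = "test-bucket";  // hypothetical names
        const Aws::String key = "big-file.bin";
        const size_t partSize = 5 * 1024 * 1024;   // 5MB parts, as in the report

        // 1. Start the multipart upload and remember the upload ID.
        Aws::S3::Model::CreateMultipartUploadRequest createReq;
        createReq.SetBucket(bucket);
        createReq.SetKey(key);
        auto createOutcome = client.CreateMultipartUpload(createReq);
        if (!createOutcome.IsSuccess()) { std::cerr << "create failed\n"; return 1; }
        const Aws::String uploadId = createOutcome.GetResult().GetUploadId();

        // 2. Upload the file in 5MB chunks, collecting each part's ETag.
        std::ifstream file("big-file.bin", std::ios::binary);
        Aws::S3::Model::CompletedMultipartUpload completed;
        std::vector<char> buffer(partSize);
        for (int partNumber = 1; file; ++partNumber) {
            file.read(buffer.data(), static_cast<std::streamsize>(partSize));
            std::streamsize n = file.gcount();
            if (n <= 0) break;

            auto body = Aws::MakeShared<Aws::StringStream>("part");
            body->write(buffer.data(), n);

            Aws::S3::Model::UploadPartRequest partReq;
            partReq.SetBucket(bucket);
            partReq.SetKey(key);
            partReq.SetUploadId(uploadId);
            partReq.SetPartNumber(partNumber);
            partReq.SetContentLength(static_cast<long long>(n));
            partReq.SetBody(body);
            auto partOutcome = client.UploadPart(partReq);
            if (!partOutcome.IsSuccess()) { std::cerr << "part " << partNumber << " failed\n"; return 1; }

            Aws::S3::Model::CompletedPart part;
            part.SetPartNumber(partNumber);
            part.SetETag(partOutcome.GetResult().GetETag());
            completed.AddParts(part);
        }

        // 3. Complete the upload; the server stitches the parts together.
        Aws::S3::Model::CompleteMultipartUploadRequest compReq;
        compReq.SetBucket(bucket);
        compReq.SetKey(key);
        compReq.SetUploadId(uploadId);
        compReq.SetMultipartUpload(completed);
        auto compOutcome = client.CompleteMultipartUpload(compReq);
        if (!compOutcome.IsSuccess()) { std::cerr << "complete failed\n"; return 1; }
    }
    Aws::ShutdownAPI(options);
    return 0;
}
```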
Here's the output from S3 when it appears to go wrong. Any help appreciated.
create_500MB_object.txt
It also works with a 100MB and a 201MB file, but fails with a 301MB file.
Regards,
Paul