
Error: "Your socket connection to the server was not read from or written to within the timeout period." #29

Closed
schmittjoh opened this issue Jan 12, 2013 · 12 comments
Labels
guidance Question that needs advice or information.

Comments

@schmittjoh

On some systems, I get the above error when trying to upload a file to Amazon S3.

Triggering Code:

UploadBuilder::newInstance()
    ->setClient($this->s3)
    ->setSource($url)
    ->setBucket($this->bucketName)
    ->setKey($key)
    ->build()
    ->upload()
;

The error can be prevented by switching the source to an in-memory one:

UploadBuilder::newInstance()
    // ...
    // Pass the content instead of just the URL
    ->setSource(EntityBody::factory($content))
    // ...
;

Perhaps you have seen this before and have an idea of how to fix this automatically?
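For completeness, the two variants above can be combined: clear PHP's stat cache and hand the builder an explicit stream with a freshly read size. This is only a sketch against the aws-sdk-php v2 / Guzzle 3 APIs shown above; $path (the local file behind the file:// URL) is an assumed name.

```php
<?php
use Aws\S3\Model\MultipartUpload\UploadBuilder;
use Guzzle\Http\EntityBody;

// $path is assumed to be the local file behind the file:// URL.
clearstatcache(true, $path);

// Give the builder a stream with an explicit, freshly read size so the
// Content-Length sent to S3 cannot come from a stale fstat cache entry.
$source = EntityBody::factory(fopen($path, 'r'), filesize($path));

UploadBuilder::newInstance()
    ->setClient($this->s3)
    ->setSource($source)
    ->setBucket($this->bucketName)
    ->setKey($key)
    ->build()
    ->upload();
```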

Complete Stack Trace:

Aws\Common\Exception\MultipartUploadException: An error was encountered while performing a multipart upload: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.

vendor/aws/aws-sdk-php/src/Aws/Common/Model/MultipartUpload/AbstractTransfer.php:177
vendor/aws/aws-sdk-php/src/Aws/Common/Exception/NamespaceExceptionFactory.php:75
vendor/aws/aws-sdk-php/src/Aws/Common/Exception/ExceptionListener.php:55
vendor/symfony/event-dispatcher/Symfony/Component/EventDispatcher/EventDispatcher.php:164
vendor/symfony/event-dispatcher/Symfony/Component/EventDispatcher/EventDispatcher.php:53
vendor/guzzle/guzzle/src/Guzzle/Http/Message/Request.php:757
vendor/guzzle/guzzle/src/Guzzle/Http/Message/Request.php:466
vendor/guzzle/guzzle/src/Guzzle/Http/Message/EntityEnclosingRequest.php:66
vendor/guzzle/guzzle/src/Guzzle/Http/Curl/CurlMulti.php:499
vendor/guzzle/guzzle/src/Guzzle/Http/Curl/CurlMulti.php:426
vendor/guzzle/guzzle/src/Guzzle/Http/Curl/CurlMulti.php:387
vendor/guzzle/guzzle/src/Guzzle/Http/Curl/CurlMulti.php:278
vendor/guzzle/guzzle/src/Guzzle/Http/Client.php:363
vendor/guzzle/guzzle/src/Guzzle/Service/Client.php:223
vendor/guzzle/guzzle/src/Guzzle/Service/Command/AbstractCommand.php:167
vendor/guzzle/guzzle/src/Guzzle/Service/Command/AbstractCommand.php:206
vendor/aws/aws-sdk-php/src/Aws/S3/Model/MultipartUpload/SerialTransfer.php:73
vendor/aws/aws-sdk-php/src/Aws/Common/Model/MultipartUpload/AbstractTransfer.php:167

Caused by
Aws\S3\Exception\RequestTimeoutException: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.

vendor/aws/aws-sdk-php/src/Aws/Common/Exception/NamespaceExceptionFactory.php:89
vendor/aws/aws-sdk-php/src/Aws/Common/Exception/NamespaceExceptionFactory.php:75
vendor/aws/aws-sdk-php/src/Aws/Common/Exception/ExceptionListener.php:55
vendor/symfony/event-dispatcher/Symfony/Component/EventDispatcher/EventDispatcher.php:164
vendor/symfony/event-dispatcher/Symfony/Component/EventDispatcher/EventDispatcher.php:53
vendor/guzzle/guzzle/src/Guzzle/Http/Message/Request.php:757
vendor/guzzle/guzzle/src/Guzzle/Http/Message/Request.php:466
vendor/guzzle/guzzle/src/Guzzle/Http/Message/EntityEnclosingRequest.php:66
vendor/guzzle/guzzle/src/Guzzle/Http/Curl/CurlMulti.php:499
vendor/guzzle/guzzle/src/Guzzle/Http/Curl/CurlMulti.php:426
vendor/guzzle/guzzle/src/Guzzle/Http/Curl/CurlMulti.php:387
vendor/guzzle/guzzle/src/Guzzle/Http/Curl/CurlMulti.php:278
vendor/guzzle/guzzle/src/Guzzle/Http/Client.php:363
vendor/guzzle/guzzle/src/Guzzle/Service/Client.php:223
vendor/guzzle/guzzle/src/Guzzle/Service/Command/AbstractCommand.php:167
vendor/guzzle/guzzle/src/Guzzle/Service/Command/AbstractCommand.php:206
vendor/aws/aws-sdk-php/src/Aws/S3/Model/MultipartUpload/SerialTransfer.php:73
vendor/aws/aws-sdk-php/src/Aws/Common/Model/MultipartUpload/AbstractTransfer.php:167
@mtdowling
Member

What's the size of the file you're uploading?

Does this error happen consistently for particular files on multiple systems?


@schmittjoh
Author

The size is about 11 KB, and it's the same file on both systems.

On one system (PHP 5.4.9, Ubuntu 12.04) it consistently fails with the above error; on the other (PHP 5.4.6, Ubuntu 12.10) it consistently works.

@mtdowling
Member

Ah, interesting. The UploadBuilder is used for multipart uploads, and a multipart upload to Amazon S3 has to be at least 5 MB (see http://docs.amazonwebservices.com/AmazonS3/latest/dev/qfacts.html). If this is in fact the error, we will look into adding a check to ensure that a file meets the uploader's minimum size.

@schmittjoh
Author

It still puzzles me a bit that the uploader works when constructing an in-memory source explicitly. Do you have an idea why that is?

Another thought: couldn't the uploader just choose whichever mechanism is suitable for the given source?
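For illustration, the dispatch suggested here could look roughly like the following. A sketch only, using the SDK v2 putObject operation; $path, $bucketName, and $key are assumed names, and the 5 MB threshold mirrors S3's minimum multipart part size:

```php
<?php
use Aws\S3\Model\MultipartUpload\UploadBuilder;

$size = filesize($path);
if ($size !== false && $size < 5 * 1024 * 1024) {
    // Small payload: a single PutObject request is sufficient.
    $s3->putObject(array(
        'Bucket'     => $bucketName,
        'Key'        => $key,
        'SourceFile' => $path,
    ));
} else {
    // Large payload: fall back to the multipart uploader.
    UploadBuilder::newInstance()
        ->setClient($s3)
        ->setSource($path)
        ->setBucket($bucketName)
        ->setKey($key)
        ->build()
        ->upload();
}
```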

@mtdowling
Member

What does the variable $url contain: a string containing a URL (e.g. 'http://www.example.com'), a resource returned by fopen('http://example.com', 'r'), an EntityBody object, or the path to a file? If it's the path to a file, is it possible that the file changes frequently and the fstat cache isn't picking up the size changes, thus reporting an erroneous filesize()?

@schmittjoh
Author

$url is a URL (i.e. file://some-path) which can be used with fopen/file_get_contents.

The contents of this URL do not change anymore, although they might have changed before calling the upload builder.


@mtdowling
Member

The reason this is happening is that cURL is being told it will send more data than it actually has. I'm not sure where the erroneous length is being specified, but clearing the fstat cache should help narrow down the possibilities. Does this still fail for you after the change I made to clear the fstat cache?
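If stale fstat data is the suspect, it can be ruled out in isolation before involving the SDK at all. A plain-PHP sketch; $path is a placeholder for the file being uploaded:

```php
<?php
// filesize() may serve a cached (stale) value for $path.
clearstatcache(true, $path);   // drop the cached stat entry for $path
$freshSize = filesize($path);  // re-read the size from disk

// If $freshSize differs from the size reported before clearing the
// cache, the erroneous Content-Length came from the stat cache.
```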

@RamyTalal

I'm having the same issue when uploading an image to S3 with the putObject method. I tried both 'Body' => file_get_contents('thefile') and 'SourceFile' => '/path/to/the/file.ext'.

The issue occurs randomly.

(Running PHP 5.4.10 on CentOS 6)

@mtdowling
Member

@RamyTalal What version of PHP are you using, what version of cURL, and what arch is your system (i386, x86_64)?

uname -a
php --version
curl --version

@RamyTalal

@mtdowling

2.6.18-308.8.2.el5.028stab101.1 #1 SMP Sun Jun 24 20:25:35 MSD 2012 x86_64 x86_64 x86_64

PHP 5.4.10

curl 7.19.7 (x86_64-redhat-linux-gnu) libcurl/7.19.7 NSS/3.13.1.0 zlib/1.2.3 libidn/1.18 libssh2/1.2.2
Protocols: tftp ftp telnet dict ldap ldaps http file https ftps scp sftp 
Features: GSS-Negotiate IDN IPv6 Largefile NTLM SSL libz

@mtdowling
Member

It seems that this error occurs intermittently when uploading to Amazon S3. I've updated the Amazon S3 client to automatically retry these specific failures using exponential backoff.

If you continue to see this issue, please ensure that you are not sending an incorrect Content-Length header in your requests.
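For older client configurations that predate this change, a comparable retry policy can be attached by hand with Guzzle 3's backoff plugin. A sketch only; the retry count and status codes below are assumptions, not the SDK's exact policy:

```php
<?php
use Guzzle\Plugin\Backoff\BackoffPlugin;

// Retry up to 3 times, with exponential delay, on the listed HTTP
// codes. 400 is included because S3 reports RequestTimeout as a 400.
$client->addSubscriber(
    BackoffPlugin::getExponentialBackoff(3, array(400, 500, 503))
);
```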

poisa pushed a commit to poisa/aws-sdk-php that referenced this issue Mar 14, 2013
@gwilym
Contributor

gwilym commented Jul 22, 2013

I've been noticing this a lot when using concurrency > 1, on several systems. Most of the time, an uploaded folder will begin to get 400s due to timeouts after the first 10 or so requests.

I did some superficial investigation into this (PHP userland layers at the lowest), and as near as I can tell this may be a cURL-related bug. I certainly can't see the Guzzle library doing anything wrong.

Despite giving cURL the correct content lengths, occasionally the request body reading function given to cURL by Guzzle will get called a second time after correctly signalling EOF with a blank string response.

This is always followed by a 20 second stop and eventually a timeout.

e.g. during PUTs in Guzzle's RequestMediator::readRequestBody (remember that curl_multi is being used here, so the jumbled output is just the "parallel" curl handle execution):

<file> ~ <read-length>

https://<bucket>.s3.amazonaws.com/<prefix>/overview_hero.jpg ~ 16384
https://<bucket>.s3.amazonaws.com/<prefix>/overview_interior.jpg ~ 16384
https://<bucket>.s3.amazonaws.com/<prefix>/p-s-win-portmans095.jpg ~ 16384
https://<bucket>.s3.amazonaws.com/<prefix>/overview_hero.jpg ~ 16384
https://<bucket>.s3.amazonaws.com/<prefix>/overview_interior.jpg ~ 1304
https://<bucket>.s3.amazonaws.com/<prefix>/p-s-win-portmans095.jpg ~ 16384
https://<bucket>.s3.amazonaws.com/<prefix>/overview_hero.jpg ~ 16384
https://<bucket>.s3.amazonaws.com/<prefix>/overview_interior.jpg ~ 0
https://<bucket>.s3.amazonaws.com/<prefix>/p-s-win-portmans095.jpg ~ 16384
https://<bucket>.s3.amazonaws.com/<prefix>/overview_hero.jpg ~ 16384
https://<bucket>.s3.amazonaws.com/<prefix>/p-s-win-portmans095.jpg ~ 16384
https://<bucket>.s3.amazonaws.com/<prefix>/overview_hero.jpg ~ 16384
https://<bucket>.s3.amazonaws.com/<prefix>/overview_hero.jpg ~ 3348
https://<bucket>.s3.amazonaws.com/<prefix>/overview_hero.jpg ~ 0
https://<bucket>.s3.amazonaws.com/<prefix>/overview_interior.jpg ~ 0

(delay begins)

Note that the last line has been hit twice despite the first one returning a 0-length string. My gut feeling is that cURL is getting into a state where it expects more data and so never concludes the request, though I can't say for sure whether libcurl itself is at the root of this.

PHP 5.3.26-1~dotdeb.0 with Suhosin-Patch (cli) (built: Jun 9 2013 03:35:34)
cURL Information => 7.21.0

I'm not proposing reopening this because I can't be 100% sure; I'm just leaving my findings behind in case someone finds them handy later. Personally, I don't know enough C to go digging around in the PHP or libcurl source.
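For anyone picking this up, the read-after-EOF behaviour described above can be probed without Guzzle by driving cURL's read callback directly. A sketch only; the URL and file path are placeholders:

```php
<?php
$path = '/path/to/local/file';          // placeholder
$fp   = fopen($path, 'r');

$ch = curl_init('https://example.com/upload');  // placeholder endpoint
curl_setopt_array($ch, array(
    CURLOPT_PUT          => true,
    CURLOPT_INFILESIZE   => filesize($path),
    CURLOPT_READFUNCTION => function ($ch, $fd, $length) use ($fp) {
        $data = (string) fread($fp, $length);
        // Log every invocation: a call arriving *after* a 0-length
        // return would reproduce the double-read noted above.
        error_log('read ' . strlen($data) . ' bytes');
        return $data;
    },
));
curl_exec($ch);
curl_close($ch);
```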
