
Backpressure on http.ServerResponse #10256

Closed
davedoesdev opened this issue Dec 13, 2016 · 7 comments
Labels: question (Issues that look for answers), stream (Issues and PRs related to the stream subsystem)

Comments

@davedoesdev
Contributor

davedoesdev commented Dec 13, 2016

  • Version: v6.9.2
  • Platform: Linux david-Latitude-E6440 4.8.0-27-generic #29-Ubuntu SMP Thu Oct 20 21:03:13 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
  • Subsystem: http, stream

Here's a test program:

var http = require('http'),
    PassThrough = require('stream').PassThrough,
    server = http.createServer();

server.on('request', function (req, res)
{
    console.log('REQUEST');
    
    var pthru = new PassThrough();

    pthru.on('end', function ()
    {
        console.log("PASSTHROUGH END");
    });

    pthru.pipe(res);

    pthru.end(Buffer.alloc(1024 * 1024)); // Buffer.alloc: new Buffer(size) is deprecated
});

server.listen(8700, function ()
{
    http.request(
    {
        port: 8700,
        method: 'GET'
    }, function (res)
    {
        console.log('RESPONSE');

        res.on('end', function ()
        {
            console.log("RESPONSE END");
        });
    }).end();
});

I expected it to display:

REQUEST
RESPONSE

because the client side isn't reading from the response, so back-pressure should be applied via res to pthru.

However, it actually displays:

REQUEST
PASSTHROUGH END
RESPONSE

Am I missing something, or is this a bug?

@mscdex added the question and stream labels Dec 13, 2016
@sam-github
Copy link
Contributor

You don't have enough data to create pressure. The GET request has an empty message body! Also, the client may not be reading yet, but the client's TCP stack is. It will read enough data to fill its buffer, at which point data will stop flowing until the client app (node) reads data from the TCP stack. In this case, you have no data to speak of, so it all goes across TCP and is buffered in the client stack. Do a giant, giant POST and you may see what you were hoping for.

@davedoesdev
Contributor Author

I'm only testing back-pressure in the response here, not in the request.
How much is giant?

@davedoesdev
Contributor Author

davedoesdev commented Dec 13, 2016

Hmm, it seems like a gigabyte (1000³ bytes) is the magic number. Above that, I don't get PASSTHROUGH END.
Are the TCP buffers really that big?

@davedoesdev
Contributor Author

I think it's also the way the pipe's working. If I write the Gb in one go, I get PASSTHROUGH END displayed. If I write it in two chunks, each of half a Gb, then I don't get PASSTHROUGH END displayed.

I guess this is because writing in two chunks gives the pipe a chance (after the first write) to notice that res is full and therefore pause.

@davedoesdev
Contributor Author

... which means the TCP buffers are smaller than a Gb, and I need to do some more testing to find the figure on my system.

@sam-github thanks for the pointer.

@sam-github
Contributor

Localhost TCP buffers might be incredibly high, because on localhost the data buffering cost all falls on your host; it doesn't matter whether it's in the sender's or the receiver's buffers. I'm not saying it is... but localhost TCP does bypass some lower TCP protocol layers as irrelevant when not running over a network.

You could try over a network. Also, I've never used the PassThrough stream, so it could be you are just observing an oddity of it.

@davedoesdev
Contributor Author

If I write 16MB in 2 chunks then I get the backpressure. 16MB seems reasonable.
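[Editor's note: when writing chunks by hand rather than via pipe, the standard way to cooperate with this backpressure is to honour write()'s return value and resume on 'drain'. A sketch; writeWithBackpressure is a hypothetical helper name, not from the issue:]

```javascript
// Hypothetical helper: stream `total` bytes into `dest` in `chunkSize`
// pieces, stopping whenever the destination's buffer is over its
// high-water mark and resuming when it drains.
function writeWithBackpressure(dest, total, chunkSize, done) {
  var written = 0;
  (function writeChunk() {
    while (written < total) {
      var size = Math.min(chunkSize, total - written);
      written += size;
      if (!dest.write(Buffer.alloc(size))) {
        // Destination is full: stop and wait for it to drain.
        dest.once('drain', writeChunk);
        return;
      }
    }
    dest.end(done);
  })();
}
```

With the server in the original report, something like writeWithBackpressure(res, 16 * 1024 * 1024, 8 * 1024 * 1024) would send the 16MB in two chunks while respecting res's buffer, instead of relying on pipe to do the pausing.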
