Streaming request body parsing #41
Conversation
I saw …
Not yet, those are also in the works though. But this is a step in that direction. The …
The HTTP message body can potentially have any size (even single fields can be huge); do we really want to store this automatically? As an alternative, Node.js only exposes the body stream and leaves it up to the consumer to pass this stream to the correct parser (for example http://stackoverflow.com/questions/4295782/how-do-you-extract-post-data-in-node-js).
No, I want to go Node's way, where a string or stream is emitted. Those can just be gathered using the buffered sink. Not sure yet about the stream part; it would result in less overhead for small fields but more code on the implementing side.
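For illustration, a minimal sketch of the "gather then parse" idea discussed here, assuming the request object behaves as a readable stream and that react/stream's `BufferedSink` is available; the field name and response wording are invented for the example:

```php
<?php

use React\Stream\BufferedSink;

// Minimal sketch, not part of this PR: collect the streaming body via the
// buffered sink mentioned above, then hand the complete string to a parser.
$http->on('request', function ($request, $response) {
    BufferedSink::createPromise($request)->then(function ($body) use ($response) {
        // Parse the gathered body as urlencoded form data (illustrative only).
        parse_str($body, $fields);

        $name = isset($fields['name']) ? $fields['name'] : 'stranger';
        $response->writeHead(200);
        $response->end('hello ' . $name);
    });
});
```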
My personal vote here would be small, independent, composable parts instead of built-in convenience. Composable parts enable convenience on a higher level, such as this draft API:

```php
$http->on('request', function (Request $request, Response $response) use ($formParser) {
    $formParser->parseDeferredStream($request)->then(function ($fields) use ($response) {
        $response->end('hello ' . $fields['name']);
    }, function ($error) use ($response) {
        $response->writeHead(400);
    });
});
```

Once we look into PSR-7 support, we could probably build a convenient middleware around this concept in order to make this available to each request handler.
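As a rough idea of what such a `parseDeferredStream()` could do internally (a conceptual sketch only, not the actual implementation in this PR): gather the body from the request stream, resolve the promise with the parsed fields, and reject it on a stream error.

```php
<?php

use React\Promise\Deferred;

// Conceptual sketch only: buffer the streaming body, then resolve with the
// parsed fields or reject on error, matching the then()/error callbacks above.
function parseDeferredStream($request)
{
    $deferred = new Deferred();
    $body = '';

    $request->on('data', function ($chunk) use (&$body) {
        $body .= $chunk;
    });

    $request->on('end', function () use (&$body, $deferred) {
        parse_str($body, $fields);
        $deferred->resolve($fields);
    });

    $request->on('error', function ($error) use ($deferred) {
        $deferred->reject($error);
    });

    return $deferred->promise();
}
```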
That looks good, I'm assuming that is just listening to …
This exactly 👍
Yeah, I too suppose this should be easier than auto-wiring all parsers 👍
Thanks for clarifying that @clue, working on that refactor right now 👍
@clue what I'm working out now would have approximately this API:

```php
$http->on('request', function (Request $request, Response $response) {
    FormParserFactory::create($request)->deferredStream()->then(function ($fields) use ($response) {
        $response->end('hello ' . $fields['name']);
    }, function ($error) use ($response) {
        $response->writeHead(400);
    });
});
```

Currently decoupling all that auto-wiring code.
The last few commits remove the tight wiring between the form parsers and the request parser. They also add a form parser factory. Next step is adding methods like …
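One way such a factory could pick a parser is by inspecting the request's Content-Type header. The following is only an illustration of that idea under that assumption; the concrete parser class names are invented for the sketch, not the classes added by this PR.

```php
<?php

// Illustration only: dispatch on the Content-Type header. The parser class
// names below are placeholders rather than the classes added by this PR.
class FormParserFactory
{
    public static function create($request)
    {
        $headers = $request->getHeaders();
        $contentType = isset($headers['Content-Type']) ? $headers['Content-Type'] : '';

        if (stripos($contentType, 'multipart/form-data') === 0) {
            return new MultipartParser($request);
        }

        if (stripos($contentType, 'application/x-www-form-urlencoded') === 0) {
            return new UrlEncodedParser($request);
        }

        // Fall back to a parser that simply exposes the raw body stream.
        return new RawBodyParser($request);
    }
}
```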
Yeah, GitHub won't even let me merge it from the site 😝. This PR is my top ReactPHP priority at the moment; I'd prefer to get reactphp/http-client#58 in ASAP so I can fully focus on this right here. I'll discuss with @clue how exactly we're going to cut it up, but since most of the discussion has already taken place here we can move relatively quickly.
Hello gents, if I may, some remarks about the adopted design:

Input: …

Suggestions: …

Output: …

Regards.
Been spending some time splitting this PR up into smaller ones, which resulted in the following pull requests:

- #62: Uploaded File object that doesn't make sense on its own, but provides something needed by #72 and #73.

Order of merging (all PRs will be squashed on merge, keeping the history clear): #69, #62, #70, #73, #71, #72.

Will go over @moe123's comment carefully and see where adjustment is necessary. One of the things I've already done due to @moe123's remarks is make all the parsers cancelable.
@andig yes, #62 and #69 are done as far as I'm concerned, unless @jsor or @clue thinks otherwise, and I'd like to get them in soon; I'll ping them on IRC tonight and see how they look at it. Once those are in, I'll start working on completing the other PRs. One of the issues I came across is that the urlencoded parser (#71) is going to be interesting, as I can't use built-in PHP functions to do the parsing without buffering.
Can't we assume, for the time being, that buffering for this case is OK, i.e. you either have a POST blob which doesn't need decoding, or you have urlencoded data that will most likely not exceed a certain size?
We could do that. I've set up several milestones that allow us to release this in parts, for example first getting the foundation out in …
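A minimal sketch of that compromise, assuming an Evenement-style event emitter; the class name, the size limit, and the event names are illustrative rather than taken from this PR:

```php
<?php

use Evenement\EventEmitter;

// Sketch of the compromise discussed above: buffer urlencoded bodies up to a
// cap, then lean on PHP's built-in parse_str(). Names and limits are made up.
class BufferingUrlEncodedParser extends EventEmitter
{
    private $buffer = '';
    private $maxSize;

    public function __construct($request, $maxSize = 65536)
    {
        $this->maxSize = $maxSize;

        $request->on('data', array($this, 'feed'));
        $request->on('end', array($this, 'finish'));
    }

    public function feed($data)
    {
        $this->buffer .= $data;

        if (strlen($this->buffer) > $this->maxSize) {
            $this->emit('error', array(new \OverflowException('Form body exceeds maximum size')));
        }
    }

    public function finish()
    {
        parse_str($this->buffer, $fields);

        foreach ($fields as $key => $value) {
            $this->emit('post', array($key, $value));
        }
    }
}
```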
Any update?
@bweston92 See this issue for our roadmap: #120 |
```diff
@@ -28,27 +37,23 @@ public function feed($data)
         // Extract the header from the buffer
         // in case the content isn't complete
-        list($headers, $this->buffer) = explode("\r\n\r\n", $this->buffer, 2);
+        list($headers, $buffer) = explode("\r\n\r\n", $this->buffer, 2);
```
This might result in a large string operation. Better use the previous `strpos` and check that against `$this->maxSize` before. You might also want to use `substr` as you already have the position then.
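Spelled out, the suggestion amounts to something like the following sketch; `headerSizeExceeded()` is a hypothetical placeholder for whatever error handling the parser actually uses:

```php
<?php

// Sketch of the suggested approach: locate the header/body boundary first,
// compare the position against the size limit, then cut with substr instead
// of explode'ing the whole buffer.
$position = strpos($this->buffer, "\r\n\r\n");

if ($position === false) {
    // Headers are not complete yet; the size limit can be enforced here too.
    if (strlen($this->buffer) > $this->maxSize) {
        $this->headerSizeExceeded(); // hypothetical error helper
    }
    return;
}

if ($position > $this->maxSize) {
    $this->headerSizeExceeded(); // hypothetical error helper
    return;
}

$headers = substr($this->buffer, 0, $position);
$buffer  = substr($this->buffer, $position + 4); // skip the "\r\n\r\n" delimiter
```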
This PR is the follow-up to #13. It started out to make multipart parsing streaming, but ended up making all bodies streaming.

The parsers emit a `post` event with the key and value of a post variable and a `file` event for uploaded files found in the request. On the request object, `getFiles` is gone due to the streaming nature of the parsers. `getPost` is still there, but it won't have everything until the entire request has been parsed.

Todo:
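Independent of the remaining todo items, a rough sketch of what consuming these events might look like, assuming the factory discussed earlier in the thread and Evenement-style `on()` listeners; the event payload shapes are assumptions, not guaranteed by this PR:

```php
<?php

// Sketch of consuming the streaming parser events described above.
$http->on('request', function ($request, $response) {
    $parser = FormParserFactory::create($request);

    $fields = array();
    $files  = array();

    // Assumed payloads: "post" carries a key/value pair, "file" an uploaded-file object.
    $parser->on('post', function ($key, $value) use (&$fields) {
        $fields[$key] = $value;
    });

    $parser->on('file', function ($file) use (&$files) {
        $files[] = $file;
    });

    $request->on('end', function () use (&$fields, &$files, $response) {
        $response->writeHead(200);
        $response->end(sprintf('received %d fields and %d files', count($fields), count($files)));
    });
});
```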