
Supported chunked behavior for responses #176

Merged
merged 5 commits into mitre:master on Sep 29, 2020

Conversation

matthiasblaesing
Contributor

Server-sent events are sent as chunked responses. The expectation is
that the chunks are sent immediately and not buffered until the whole
response is done.

If a chunked response is detected, the chunks are read as usual and each
read chunk is immediately flushed, replicating the upstream behavior
downstream.

@matthiasblaesing matthiasblaesing mentioned this pull request Sep 23, 2020
@dsmiley
Collaborator

dsmiley commented Sep 23, 2020

Wouldn't it be the job of the HttpEntity to do this in writeTo(), if indeed this is any better/faster? Put differently, maybe you should take this up with Apache HttpComponents?

@matthiasblaesing
Contributor Author

The change does not make it faster; indeed it may be slower, as smaller packets are sent. The important part is that for server-sent events the packets need to be forwarded immediately, otherwise the stream degenerates into a normal HTTP request and just blocks the browser.

This is a special case that is only valid for the proxy use case. If the stream is, for example, a download of unknown length that is streamed to a file, a flush is not necessary. So I would argue that the proxy servlet is the special case here.
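The forward-immediately behavior described here amounts to a plain stream copy with an explicit flush after every read. A minimal, self-contained sketch (the class and method names are hypothetical illustrations, not the proxy servlet's actual code):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.SequenceInputStream;

public class ChunkedCopy {
    // Copy upstream bytes to downstream, flushing after every read so each
    // chunk is forwarded immediately instead of sitting in an output buffer.
    static void copyAndFlush(InputStream upstream, OutputStream downstream) throws IOException {
        byte[] buffer = new byte[4096];
        int n;
        while ((n = upstream.read(buffer)) != -1) {
            downstream.write(buffer, 0, n);
            downstream.flush(); // the key difference from a plain stream copy
        }
    }

    public static void main(String[] args) throws IOException {
        final int[] flushes = {0};
        // Count flushes to show that every read is pushed through immediately.
        ByteArrayOutputStream sink = new ByteArrayOutputStream() {
            @Override public void flush() { flushes[0]++; }
        };
        // Two "chunks": SequenceInputStream yields them in separate reads.
        InputStream chunks = new SequenceInputStream(
                new ByteArrayInputStream("data: one\n\n".getBytes()),
                new ByteArrayInputStream("data: two\n\n".getBytes()));
        copyAndFlush(chunks, sink);
        System.out.println("flushes=" + flushes[0]);
    }
}
```

In a servlet context, `downstream` would be the servlet response's output stream, and `flush()` pushes the container's output buffer to the client rather than waiting for the response to complete.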

@dsmiley
Collaborator

dsmiley commented Sep 27, 2020

Can you add a test please? AFAICT the existing tests don't exercise this code.

@matthiasblaesing
Contributor Author

For the test: sure, but how far can I go? HttpUnit is not adequate for this check. If I understand correctly, it emulates a servlet container, and this emulation is leaky: a response is only created/made accessible after a request has finished executing. A test in this case becomes absurd, as the whole point is reading data before the service method finishes.

So can I add Jetty to the test dependencies?

@dsmiley
Collaborator

dsmiley commented Sep 28, 2020

Yes, let's add Jetty. The existing test infrastructure has been holding back testing some things that need a real webserver (like Jetty). Also I'm willing to change the project dependencies to Java 1.8+ if that helps.

Supporting changes:
- add Jetty as a test dependency, used as the servlet execution engine
- bump javax.servlet-api to version 3.1.0, which is required by Jetty

Jetty 9.4 requires Java 8 as the minimum version.
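The supporting changes above would roughly correspond to pom.xml fragments like the following (a sketch; the exact Jetty artifact and version are assumptions, not taken from the actual commit):

```xml
<!-- servlet API 3.1.0, required by Jetty 9.4 -->
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>javax.servlet-api</artifactId>
  <version>3.1.0</version>
  <scope>provided</scope>
</dependency>

<!-- embedded Jetty as the servlet execution engine for tests only -->
<dependency>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-servlet</artifactId>
  <version>9.4.31.v20200723</version>
  <scope>test</scope>
</dependency>
```

Scoping Jetty to `test` keeps it out of the published artifact, so only the Java 8 minimum and the servlet-api bump affect downstream users.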
@matthiasblaesing
Contributor Author

I pushed an update with a unit test. I verified that the test fails without the change and runs clean with it.

@dsmiley dsmiley merged commit 22d4420 into mitre:master Sep 29, 2020
@dsmiley
Collaborator

dsmiley commented Sep 29, 2020

Thanks for writing this test!

@matthiasblaesing
Contributor Author

Thanks for taking the time to get this in and for maintaining the proxy servlet 👍
