Chunked Responses for Batch requests #396
@gauravmk I think that's a great idea. You'd have to make sure it works with Apollo Client's network interface, though, because it's no use if the response is chunked but Apollo Client doesn't get the result until the entire response is back. Maybe you could start with a PR to Apollo Server, and then one to Apollo Client, to support the chunking on both ends of the transport? On the server there should be an option to turn it on or off, but on the client it can always be supported, I think.
Yep, I got a reference implementation working in our codebase using a custom network interface; I got our server to respect a

It uses the new Fetch API, but there's mixed support for it: Chrome supports it natively and there are polyfills for some browsers, but others legitimately have no way of flushing partial responses predictably. I've only tested it with Chrome. I've had some conversations with other folks at Remind, and a couple of other ideas were thrown around, including using HTTP/2 to support a streaming interface.

Here's the streaming interface I have on scratch branches; I'll clean it up and put out a couple of PRs on server and on client. Client (some copy-paste from BatchNetworkInterface):
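The client snippet from the scratch branch didn't survive extraction. As a rough sketch of the client-side idea, assuming the server streams one JSON result per line and that chunk boundaries can fall anywhere in the text, the parsing logic might look like this (`parseChunkedResults` is a hypothetical helper, not Apollo Client API):

```javascript
// Sketch: turn an async iterable of text chunks into parsed JSON results,
// buffering until a complete newline-terminated line is available.
async function* parseChunkedResults(chunks) {
  let buffer = '';
  for await (const chunk of chunks) {
    buffer += chunk;
    let newline;
    while ((newline = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line) yield JSON.parse(line); // hand each result off as it arrives
    }
  }
  // A final result may arrive without a trailing newline.
  if (buffer.trim()) yield JSON.parse(buffer.trim());
}
```

In a browser this would be fed from `response.body.getReader()`; browsers without readable response streams have no reliable way to observe partial responses, which matches the mixed support mentioned above.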
Server:
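The server snippet is also missing from the thread. A minimal sketch of the shape being described, written as an Express-style handler — the `executeQuery` function, the batch shape, and the `application/x-ldjson` content type are all assumptions, not the actual branch:

```javascript
// Sketch: answer a batched request by writing each query's result as its own
// chunk the moment it resolves, instead of buffering the whole response array.
// `executeQuery` is a hypothetical (operation) => Promise<result> function.
function chunkedBatchHandler(executeQuery) {
  return async (req, res) => {
    res.setHeader('Content-Type', 'application/x-ldjson');
    const batch = req.body; // assumed: [{ id, query, variables }, ...]
    await Promise.all(
      batch.map(async (operation) => {
        const result = await executeQuery(operation);
        // One JSON object per line; the id lets the client match results
        // to queries, since completion order is arbitrary.
        res.write(JSON.stringify({ id: operation.id, ...result }) + '\n');
      })
    );
    res.end();
  };
}
```

In Express each `res.write` can go out as its own HTTP chunk, provided no buffering middleware (e.g. compression) sits in front of the response.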
The way I did it for our server is
Could follow the convention of ldjson (https://www.npmjs.com/package/ldjson): instead of returning an array of responses, return responses separated by newlines.
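Concretely, the wire format would change from a single JSON array to one response object per line, each delivered as its own chunk (the ids and payloads below are purely illustrative):

```
{"id":"q1","data":{"currentUser":{"name":"Ada"}}}
{"id":"q2","data":{"feed":[]}}
```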
Would this be possible through WebSockets and/or Server-Sent Events as well? I'm already using

It would be sweet to request through a socket and receive results downstream as they resolve.
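For the Server-Sent Events variant, the same per-result streaming maps naturally onto SSE `data:` events. A sketch under the same assumptions as the handler above (`executeQuery` and the batch shape are hypothetical):

```javascript
// Sketch: the batch-streaming idea over Server-Sent Events. Each resolved
// query becomes one SSE event; the client (e.g. an EventSource) correlates
// events by id. `executeQuery` is a hypothetical (operation) => Promise.
function sseBatchHandler(executeQuery) {
  return async (req, res) => {
    res.setHeader('Content-Type', 'text/event-stream');
    res.setHeader('Cache-Control', 'no-cache');
    const batch = req.body; // assumed: [{ id, query, variables }, ...]
    await Promise.all(
      batch.map(async (operation) => {
        const result = await executeQuery(operation);
        // SSE framing: a "data:" line followed by a blank line ends one event.
        res.write(`data: ${JSON.stringify({ id: operation.id, ...result })}\n\n`);
      })
    );
    res.end();
  };
}
```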
I'm highly interested: has there been any traction on this? I have some use cases where I could really take advantage of this idea of streaming results as they come in.
I suppose as a workaround you can use a GraphQL subscription to send the data if it is arriving in pieces.
Thanks, I'll certainly look into it.
@gauravmk this is a really interesting idea! We are planning, with Apollo Server 3, to implement a more flexible transport layer and add support for
The proposal for transports is now open in #3184. We still have a way to go before
Can you show an example, please?
Because apollo-client automatically batches requests, a single slow query can block a whole page from loading instead of optimizing for "time to interactive". One way to solve this is for client programmers to manually tune the batching behavior; a more automatic and efficient way is for the server to stream results as each request completes.

You could imagine an option for a "streaming GraphQL server" that sends down chunked responses as each query completes. Clients would have to be able to handle results arriving in arbitrary order.
I hacked together something quick in Express using our own API and the graphql-server library. The second query is much, much faster than the first, and the results get sent down in the order they finish.
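"Arbitrary order" means the client has to re-associate each streamed result with the query that produced it, for example via an id attached to each response. One hypothetical way to sketch that bookkeeping:

```javascript
// Sketch: resolve one promise per batched query as its result streams in,
// regardless of arrival order. Ids are assumed to be attached server-side.
function createBatchTracker(operationIds) {
  const pending = new Map();
  const promises = operationIds.map(
    (id) => new Promise((resolve) => pending.set(id, resolve))
  );
  return {
    promises, // one promise per query, in the original request order
    receive(result) {
      const resolve = pending.get(result.id);
      if (resolve) {
        pending.delete(result.id);
        resolve(result);
      }
    },
  };
}
```

Each caller awaits its own promise, so a fast query's consumer is unblocked immediately even if a slow query in the same batch is still running.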