Hi, in the context of 1309 I have a custom HttpClientHandler implementation that uses a custom HTTP/2 client, which I provide via the ChannelSettings. I can stream data up to the server in chunks, and down as well, but I cannot figure out how to read the original gRPC request content in a streaming fashion so that I can forward the client messages to the server the moment they become available.
then the first handler will print CONTENT 1 only after the client stream has reached EOF (that is, after you call call.RequestStream.CompleteAsync()). All my buffered messages are sent at this point.
The second handler actually completes immediately and prints CONTENT 2 right after the headers have been sent. However, the stream returned there is an empty MemoryStream(), not the actual underlying client stream that I could read messages from.
In the other direction, to support server streaming, the gRPC implementation expects me to provide an HttpContent object that supports blocking reads from the embedded stream each time the server sends a message. But it seems this behavior is not replicated for the client stream, so I cannot do the same on my side.
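To illustrate what I mean on the server-streaming side, here is a minimal sketch (the names and structure are illustrative, not my actual code): the handler hands gRPC a response stream whose reads block until the next server message arrives, for example via a System.IO.Pipelines Pipe.

```csharp
using System.IO.Pipelines;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;

class ServerStreamSketch
{
    // Hypothetical: called by the custom handler once response headers arrive.
    // The custom HTTP/2 code writes each received DATA frame to `serverMessages`;
    // gRPC's reads on the content stream block until the next flush.
    static HttpResponseMessage BuildResponse(out PipeWriter serverMessages)
    {
        var pipe = new Pipe();
        serverMessages = pipe.Writer;

        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            // ReadAsync on this stream waits for the writer's next flush and
            // completes at end-of-stream (trailers handling elided here).
            Content = new StreamContent(pipe.Reader.AsStream())
        };
        response.Content.Headers.ContentType =
            new MediaTypeHeaderValue("application/grpc");
        return response;
    }
}
```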
At the moment I see no way to actually implement this with the given API; I am even wondering how .NET does it with its own HTTP/2 implementation.
I would appreciate it if someone could provide some input/insight, as I feel I am at my wits' end here.
I'm pretty sure request.Content.ReadAsStreamAsync() won't do what you want. It has been a couple of years since I did low-level work intercepting content, but I found that it buffered.
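For context, a self-contained sketch of why it buffers (my own minimal repro, not gRPC code): for content types that don't override CreateContentReadStreamAsync, ReadAsStreamAsync serializes the whole body into an internal buffer first, so for push-style streaming content the returned task cannot complete until the writer side finishes.

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

// Minimal push-style content, loosely modeled on a streaming request body:
// SerializeToStreamAsync completes only once the producer is done writing.
class PushContent : HttpContent
{
    private readonly Func<Stream, Task> _writer;
    public PushContent(Func<Stream, Task> writer) => _writer = writer;

    protected override Task SerializeToStreamAsync(Stream stream, TransportContext? context)
        => _writer(stream);

    protected override bool TryComputeLength(out long length)
    {
        length = 0;
        return false; // unknown length => streamed body
    }
}

class Demo
{
    static async Task Main()
    {
        var producerDone = new TaskCompletionSource();
        var content = new PushContent(async stream =>
        {
            await stream.WriteAsync(new byte[] { 1, 2, 3 });
            await producerDone.Task; // producer still "streaming"
        });

        // ReadAsStreamAsync buffers: it cannot complete while the
        // producer is still writing.
        Task<Stream> read = content.ReadAsStreamAsync();
        Console.WriteLine(read.IsCompleted); // False

        producerDone.SetResult();            // producer reaches EOF
        Stream buffered = await read;
        Console.WriteLine(buffered.Length);  // 3: everything arrived at once
    }
}
```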
Just to answer this: the behavior of HttpContent.CopyToAsync is that it will replace the underlying stream with the stream provided in the call, as long as nothing has been written yet. This allows you to inject your own stream and implement blocking (and unblocking) behavior as required when gRPC actually writes to the stream, avoiding any buffering.
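A sketch of that pattern (the handler and stream names are mine, and the actual HTTP/2 plumbing is elided): pass your own write-only Stream to request.Content.CopyToAsync, and forward each write the moment it happens.

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Write-only stream that surfaces each gRPC write immediately (hypothetical).
class ForwardingStream : Stream
{
    private readonly Func<ReadOnlyMemory<byte>, CancellationToken, ValueTask> _onData;
    public ForwardingStream(Func<ReadOnlyMemory<byte>, CancellationToken, ValueTask> onData)
        => _onData = onData;

    public override ValueTask WriteAsync(ReadOnlyMemory<byte> buffer,
        CancellationToken cancellationToken = default)
        => _onData(buffer, cancellationToken); // one call per message write

    public override void Write(byte[] buffer, int offset, int count)
        => WriteAsync(new ReadOnlyMemory<byte>(buffer, offset, count))
            .AsTask().GetAwaiter().GetResult();

    public override void Flush() { }
    public override Task FlushAsync(CancellationToken cancellationToken) => Task.CompletedTask;

    public override bool CanRead => false;
    public override bool CanSeek => false;
    public override bool CanWrite => true;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }
    public override int Read(byte[] buffer, int offset, int count)
        => throw new NotSupportedException();
    public override long Seek(long offset, SeekOrigin origin)
        => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
}

// Hypothetical handler over a custom HTTP/2 client (transport details elided).
class CustomHttp2Handler : HttpMessageHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        var forward = new ForwardingStream((data, ct) =>
        {
            // Send `data` as an HTTP/2 DATA frame on the open stream here.
            return ValueTask.CompletedTask;
        });

        // Do NOT await this before returning the response: it completes only
        // after the client calls RequestStream.CompleteAsync() (end of stream).
        Task pump = request.Content!.CopyToAsync(forward, cancellationToken);
        _ = pump.ContinueWith(_ => { /* send END_STREAM here */ },
            TaskScheduler.Default);

        // Build the response from the server side of the HTTP/2 stream (elided).
        return await Task.FromResult(new HttpResponseMessage());
    }
}
```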