Fix memory leak in grpc-js bidi streams #2653
Conversation
Signed-off-by: Matteo Collina <[email protected]>
I've tested the fix in my environment and it works like a charm!
#2650 rewrites a lot of this server code, and it includes a change that should fix this bug.
Did you run the test? I don't think that PR overlaps with this area of code. I think the streaming logic could be simplified further.
That PR completely deletes and replaces the
Specifically, I am referring to this section.
I think we can prepare a test that fails in that PR too. Specifically, there is no backpressure handled when calling
There is backpressure, but there's a bit of action at a distance. At the interceptor API level, a single call to
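The pull model hinted at here (one read request delivers at most one message, so the consumer paces the producer) can be sketched roughly as follows. All names below (`makePullSource`, `startRead`, `drain`) are illustrative, not the actual grpc-js interceptor API:

```javascript
// Illustrative pull-based read API: each startRead() call delivers at
// most one message, so nothing is buffered ahead of the consumer.
// Hypothetical names, not grpc-js internals.
function makePullSource(items) {
  let i = 0;
  return {
    startRead(cb) {
      if (i < items.length) {
        cb(null, items[i++]); // deliver exactly one message per call
      } else {
        cb(null, null);       // null signals end of stream
      }
    },
  };
}

// The consumer requests the next message only after handling the
// previous one; this is how backpressure propagates in a pull model.
function drain(source, onMessage) {
  source.startRead((err, msg) => {
    if (msg === null) return;
    onMessage(msg);
    drain(source, onMessage);
  });
}
```

The "action at a distance" is that each such pull can ripple down through intermediate layers until it reaches whatever is actually pausing the transport.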
You are right! I would put a comment in https://github.com/murgatroid99/grpc-node/blob/eeaa6c0e6e360b4d884f4654c07796f7149645f3/packages/grpc-js/src/subchannel-call.ts#L386, because I think that's the

After taking a look at that whole logic, I think it could be greatly simplified by using

What's the timeline for the interceptor PR? Might be worthwhile shipping this in a patch.
That link points to client code. The code there is similar, but it's not actually involved here. The pause call at a similar level on the server side is here.

It's not really clear to me that using

I'm hoping to get the server interceptor change out within the next week.
Release 1.10.0 includes the server interceptors change. |
This PR fixes a relatively bad memory leak that occurs when the client sends a huge stream of messages to a server that is slow in processing them. In the current implementation, messages were kept buffered inside the
.messageToPush
array. This PR instead pauses the underlying H2 stream, so that the memory leak does not occur.