rpcclient: make sure batch requests are GCed #2105
Conversation
This commit makes sure the batch requests are always GCed before sending back the responses for them. In particular,
- `removeRequest` didn't remove the item from `batchList`, which is now fixed.
- `Send` didn't remove the request from `requestMap`, which is now fixed by using `removeRequest`.
Pull Request Test Coverage Report for Build 7600694273
💛 - Coveralls
Really really great find!!! Basically LGTM - just one question
log.Errorf("Unable to marshal result: %v for req=%v",
	err, request.id)
what's the rationale for changing the error handling here? do we expect to get an unmarshal error here frequently?
Since it's a list of requests batched together, a failure in one or more requests should not affect sending responses to the other requests. And nope, I think we should almost never get an unmarshal error.
Yeah, an unmarshal error here would mean that the full node sent over garbled JSON.
// TODO(yy): need to double check to make sure there's no
// concurrent access to this batch list, otherwise we may miss
// some batched requests.
yeah - and I guess we also miss all the ones that were in the list that didnt necessarily cause the failure...
yeah, need more love in this lib, more tests, etc
lgtm!
LGTM ☘️
In addition, a `Copy` method is added to `MsgBlock` to fix another heap escape found in lnd's `GetBlock`.