http2: refactor to avoid unsafe array iteration #36700
Conversation
lib/internal/http2/core.js
Outdated
@@ -1663,12 +1667,12 @@ class ClientHttp2Session extends Http2Session {
      this[kUpdateTimer]();

      if (headers !== null && headers !== undefined) {
-       for (const header of ObjectKeys(headers)) {
+       ArrayPrototypeForEach(ObjectKeys(headers), (header) => {
A for..of loop has higher performance compared to forEach (correct me if I am wrong), so are we still fine going with forEach for this specific case here?
P.S. Just started with open-source reviews and was trying to learn from other PRs.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Yes, for..of is more performant, which is why it was chosen to begin with. It's also why this pull request will need a benchmark run before it can be merged. The problem with for..of is that it is vulnerable to prototype pollution attacks. If some third-party module deep in your dependencies decides to inject some code into Array.prototype[Symbol.iterator], then for..of will happily run that code. That may or may not be a problem for the end user's code; there are legitimate use cases (like profiling) for this sort of so-called "monkeypatching", but we don't want Node.js core to run the altered code.
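A hypothetical illustration of the point above (not code from the PR): patching the shared array iterator changes the behavior of every for..of over an array, while a reference to forEach saved at startup (what primordials such as ArrayPrototypeForEach capture) never consults the iteration protocol.

```javascript
// Save the genuine iterator so we can restore it afterwards.
const savedIterator = Array.prototype[Symbol.iterator];
let patchedCalls = 0;

// A third-party "monkeypatch": count how often the iterator is fetched.
Array.prototype[Symbol.iterator] = function () {
  patchedCalls++;                       // injected code runs here
  return savedIterator.call(this);
};

for (const h of ['a', 'b']) {}          // consults Symbol.iterator once

// A saved forEach reference bypasses the iteration protocol entirely,
// so the patch above never runs for this call.
const savedForEach = Array.prototype.forEach;
savedForEach.call(['a', 'b'], () => {});

Array.prototype[Symbol.iterator] = savedIterator;  // restore
```

After this runs, patchedCalls is 1: only the for..of loop executed the injected code.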
LGTM if benchmarks are OK
@@ -3095,7 +3099,7 @@ Http2Server.prototype[EventEmitter.captureRejectionSymbol] = function(
      case 'stream':
        // TODO(mcollina): we might want to match this with what we do on
        // the compat side.
-       const [stream] = args;
+       const { 0: stream } = args;
Isn't this also "vulnerable" to prototype pollution "attacks," if we really want to consider that "unsafe"? I mean, one could define a getter on Array.prototype that returns something "malicious."
I'm really not sure if all this complexity and illegibility are worth the minimal benefit.
Defining a 0 property on Array.prototype would have an effect only on empty arrays, right? args here is created by a rest parameter, so I think this code would actually be safe even in case of Array.prototype pollution.

Object.defineProperty(Array.prototype, '0', { get() { return 9; }, set(v) { /* no op */ } });
((t, ...args) => args[0])(1, 2, 3) === 2; // true

I don't think we are concerned about code being "malicious" anyway; what we are trying to achieve with the move to primordials is to allow users to monkey-patch the built-in objects (for debugging purposes, for tinkering, etc.) without having errors thrown by Node.js internals.
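A hypothetical sketch of why the PR's change helps here: `const [x] = args` goes through the iteration protocol, while `const { 0: x } = args` is a plain own-property read, so only the former breaks when Symbol.iterator is tampered with. The array contents below are illustrative.

```javascript
const saved = Array.prototype[Symbol.iterator];

// Simulate hostile (or merely broken) monkey-patching.
Array.prototype[Symbol.iterator] = function () {
  throw new Error('tampered iterator');
};

const args = ['stream-object', 'extra'];

// Object-style destructuring: a direct index lookup, unaffected.
const { 0: viaProperty } = args;

// Array destructuring: invokes the (tampered) iterator and throws.
let viaIteratorThrew = false;
try {
  const [viaIterator] = args;
} catch {
  viaIteratorThrew = true;
}

Array.prototype[Symbol.iterator] = saved;  // restore
```

Here viaProperty is 'stream-object' while the array-pattern form throws, which is exactly the failure mode the `{ 0: stream }` rewrite avoids.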
> so I think this code would actually be safe even in case of Array.prototype pollution.

Might be, I am not sure.

> without having errors thrown by Node.js internals.

I understand the motivation, but I am not sure it's worth the cost to maintainability, readability, etc.
To address the maintainability concern, would you prefer if we added a linter rule to forbid array destructuring assignment?
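One way such a lint rule could be expressed is with ESLint's no-restricted-syntax rule and an AST selector. The fragment below is a hypothetical illustration, not the rule this PR or Node.js core actually adopted.

```javascript
// Hypothetical ESLint config fragment: flag every array destructuring
// pattern (ArrayPattern nodes) with a message pointing at the safe form.
const config = {
  rules: {
    'no-restricted-syntax': [
      'error',
      {
        selector: 'ArrayPattern',
        message:
          'Avoid array destructuring; use the object form ' +
          '({ 0: x } = arr) so the iteration protocol is not consulted.',
      },
    ],
  },
};
```

A selector-based rule like this is cheap to maintain because it needs no custom plugin, only a config entry.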
Benchmark CI: https://ci.nodejs.org/view/Node.js%20benchmark/job/benchmark-node-micro-benchmarks/887/ (queued)

I'm a bit of a -1 since benchmarks showed some regression.

Another benchmark run confirmed there's a perf regression for low values of

Replacing Benchmark CI: https://ci.nodejs.org/view/Node.js%20benchmark/job/benchmark-node-micro-benchmarks/889
PR-URL: nodejs#36700 Reviewed-By: James M Snell <[email protected]> Reviewed-By: Rich Trott <[email protected]>
Landed in 0fd9bbb
PR-URL: #36700 Reviewed-By: James M Snell <[email protected]> Reviewed-By: Rich Trott <[email protected]>
Checklist
- make -j4 test (UNIX) or vcbuild test (Windows) passes