In working on the remote tunnels SDK, now being used in part for the 'exec server', Martin suggested we publish an official event npm module. I already had a small VS Code compatible module for use in my own code. I updated it to support the full set of EmitterOptions that we do in core, and out of curiosity compared it against the implementation in core. I suspect our usage of linked lists is responsible for the bulk of the performance delta: it's very nice and computer-sciencey, but I think we get dramatically punished by poor cache locality in practice. There is, of course, memory overhead for the list nodes as well.
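To illustrate the trade-off, here is a minimal sketch of an array-backed emitter (not the actual VS Code or npm module code; the names `Emitter`, `event`, and `fire` mirror the core API shape, but the implementation is my own simplification). Listeners live in one contiguous array, so firing walks memory sequentially and subscription needs no per-node allocation, unlike a linked list of nodes:

```typescript
type Listener<T> = (data: T) => void;

interface Disposable {
  dispose(): void;
}

// Simplified sketch of an array-backed emitter. A linked-list design
// allocates one node object per listener and chases pointers on fire();
// here, fire() iterates a flat array instead.
class Emitter<T> {
  private listeners: Listener<T>[] = [];

  // Subscribe; returns a disposable that removes the listener.
  event(listener: Listener<T>): Disposable {
    this.listeners.push(listener);
    return {
      dispose: () => {
        const i = this.listeners.indexOf(listener);
        if (i >= 0) {
          this.listeners.splice(i, 1);
        }
      },
    };
  }

  fire(data: T): void {
    // Snapshot the array so listeners added or removed during delivery
    // do not affect this emission.
    for (const listener of this.listeners.slice()) {
      listener(data);
    }
  }
}
```

Usage: `const e = new Emitter<number>(); const d = e.event(n => console.log(n)); e.fire(42); d.dispose();`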
I then gathered some initial data on how we use events in vscode.
I initially hypothesized we'd see a bimodal distribution between emitters with a small number of listeners and ones with a large number of listeners. In reality, it's much closer to a power-law distribution: the plurality of event emitters have no listeners at all, and the second most common case is a single listener. This data was collected after VS Code started up with an editor open:
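That distribution suggests an obvious optimization, sketched below under my own assumptions (this is not the shipped implementation): since most emitters have zero or one listener, store a single listener inline and only allocate an array once a second listener subscribes, so the overwhelmingly common cases pay for no container at all.

```typescript
type Listener<T> = (data: T) => void;

// Sketch exploiting the observed distribution: the `listeners` field is
// undefined (no listeners), a bare function (one listener), or an array
// (the rare many-listener case).
class Emitter<T> {
  private listeners?: Listener<T> | Listener<T>[];

  event(listener: Listener<T>): { dispose(): void } {
    if (this.listeners === undefined) {
      this.listeners = listener; // common case: no allocation at all
    } else if (typeof this.listeners === "function") {
      this.listeners = [this.listeners, listener]; // upgrade to array
    } else {
      this.listeners.push(listener);
    }
    return {
      dispose: () => {
        if (this.listeners === listener) {
          this.listeners = undefined;
        } else if (Array.isArray(this.listeners)) {
          const i = this.listeners.indexOf(listener);
          if (i >= 0) this.listeners.splice(i, 1);
        }
      },
    };
  }

  fire(data: T): void {
    if (typeof this.listeners === "function") {
      this.listeners(data); // single-listener fast path
    } else if (Array.isArray(this.listeners)) {
      for (const l of this.listeners.slice()) l(data);
    }
  }
}
```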
We can also estimate how much time was spent in the delivery mechanics during boot. This measures the time to create the delivery queue, not to call each individual listener, so it slightly underestimates the actual cost.
Stats for the first 13 seconds of the editor:

- Time spent in emitter overhead: 33.0 ms
- Total number of event emissions: 13,882
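The measurement approach can be sketched roughly as follows (a hypothetical instrumentation, not the actual profiling code used): time only the construction of the delivery snapshot, accumulate it into a counter, and invoke the listener bodies outside the timed region.

```typescript
// Hypothetical instrumentation: measure only the emitter's own overhead
// (building the delivery queue), excluding the time spent inside the
// listener callbacks themselves.
let emitterOverheadMs = 0;
let emissionCount = 0;

function instrumentedFire<T>(listeners: Array<(data: T) => void>, data: T): void {
  const start = performance.now();
  const queue = listeners.slice(); // the delivery queue is the timed part
  emitterOverheadMs += performance.now() - start;
  emissionCount++;
  for (const l of queue) {
    l(data); // listener bodies run untimed
  }
}
```

Summing `emitterOverheadMs` across all emissions during boot yields the kind of aggregate figure reported above.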
From a heap snapshot, I manually tallied memory usage for emitters, which comes to about 335 KB in total. Again, this is an underestimate.
I believe the biggest benefit is an expected ~15ms improvement in startup time.
My reworking saved slightly less time than expected, which I attribute to having to implement some subtleties of vscode's emitter that my initial implementation lacked and that weren't covered by tests. Regardless, it saves roughly 10ms of startup time and is almost certainly faster than the old implementation.