Adjust buffer replay logic for restarts w/ new ports #5450
Conversation
Co-authored-by: Jason Grout <[email protected]>
Test failure looks legitimate:
I agree - this test failure looks legit and I should have noticed this yesterday (too anxious to get the weekend started). Unfortunately, I have no clue how that particular test works. I suspect these changes may introduce another "event", since a reconnect to the ports now occurs.

To be honest, these server changes aren't required for EG's remote kernels, since a new session_key is introduced anyway (due to the way the kernel requests are proxied), which results in the buffer-replay logic being skipped.

I've verified that restarts from Lab with EG kernels now work with only the fix provided by @jasongrout in jupyterlab/jupyterlab#8432. I also checked that restarts with local kernels (which retain ports across restarts) still work using only the Lab change. So, IMHO, this PR could probably be deferred to jupyter_server - where kernel providers will definitely introduce port changes on restarts.
I don't have the bandwidth to dive into this; I'd say defer it.
IIRC, I needed this when I was insisting that the ports change on restart by explicitly setting the port caching to false (see the original issue).
Thank you for the responses and clarification (Jason). Since this change runs a non-zero chance of destabilizing behaviors that assume ports are static across restarts, I'm in favor of closing this PR and addressing this in jupyter_server - where the wiggle room for changes like this is a bit wider.
Spent a bit of time with this and see that restarts produce the following for local kernels via the Notebook UI:
This does not happen using the Lab UI. The reason this works via the Notebook UI against an EG server is due to the lack of handling of the session key, which winds up skipping the replay logic where the issue occurs. It also appears that buffer replay behaves slightly differently via the Lab UI, since no messages get buffered when the Lab UI is used. I suspect the Notebook UI must be triggering a close() that the Lab UI doesn't - thereby triggering the buffering of the status messages across a restart. Here's the log sequence for a restart via Lab:
And that of a restart via Notebook:
I think the proper change is for the replay to use `stream = buffer_info['channels'][channel]`. When that is done, restarts also work via Notebook when replays are active. However, the kernel.js test still has failures, but of a different nature: https://travis-ci.org/github/kevin-bates/notebook/jobs/688097176 - and I just don't know that particular test's mechanics well enough to determine whether it's just an overzealous test or a real side-effecting issue.
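To make that replay step concrete, here's a minimal, standalone sketch of the logic being discussed. It is not the actual handler code: `maybe_replay`, its parameters, and the `on_zmq_reply` callback are illustrative stand-ins; only the session-key gate and the `buffer_info['channels'][channel]` lookup reflect the behavior described above.

```python
# Illustrative sketch only: the buffer-replay step modeled as a plain function.
# The real logic lives in the server's websocket handler; names are stand-ins.

def maybe_replay(buffer_info, session_key, on_zmq_reply):
    """Replay buffered messages to a reconnecting client, if appropriate."""
    # Replay only happens when the reconnecting client's session key matches
    # the one recorded when buffering started. With EG's proxied kernels the
    # session key differs, so the replay gets skipped there - which is why
    # the issue doesn't surface via the Notebook UI against an EG server.
    if not buffer_info or buffer_info.get('session_key') != session_key:
        return

    for channel, msg_list in buffer_info.get('buffer', []):
        # The suggested fix: take the stream from the channels captured in the
        # buffer, rather than assuming the handler's current channels (which
        # may have just been re-created against new ports) line up with it.
        stream = buffer_info['channels'][channel]
        on_zmq_reply(stream, msg_list)


# Tiny usage example with placeholder data:
buffered = {
    'session_key': 'kernel-abc:client-1',
    'channels': {'iopub': '<buffered iopub stream>'},
    'buffer': [('iopub', ['status: busy']), ('iopub', ['status: idle'])],
}
maybe_replay(buffered, 'kernel-abc:client-1',
             lambda stream, msgs: print(stream, msgs))
```

Looking the stream up in the buffered channels keeps the replay independent of whether the freshly created channels point at new ports.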
This change derives from the issues jupyterlab/jupyterlab#8013 and jupyter-server/enterprise_gateway#756. The problem is that the server generally assumes that a kernel's ports remain the same across restarts. The Notebook UI re-establishes its websocket connection to the server on restarts, while Lab does not. However, even if Lab adds a reconnect (see jupyterlab/jupyterlab#8432), it does so using the same client id - which reconnects to the old channels (ports).
This change (proposed by @jasongrout - thank you!) preserves any buffer replay logic that might correspond to the client id, but unconditionally establishes the channels, irrespective of whether the underlying ports were changed, thereby restoring communication to the restarted kernel.
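For illustration, here is a rough sketch of the shape of that change, written as a plain function rather than the actual handler code; `create_stream`, `get_buffer`, and `on_zmq_reply` are assumed stand-ins for the corresponding handler and kernel-manager pieces, not their real signatures.

```python
# Simplified illustration of the approach: always (re)create the ZMQ streams so
# they point at the kernel's current ports, then replay any buffered messages
# belonging to this client's session key. Names/signatures are stand-ins for
# the real handler and kernel-manager methods.

def open_connection(kernel_id, session_key, get_buffer, create_stream, on_zmq_reply):
    # Unconditionally establish the channels, irrespective of whether the
    # underlying ports changed across the restart.
    channels = create_stream(kernel_id)

    # Preserve the buffer-replay behavior for a returning client id.
    buffer_info = get_buffer(kernel_id, session_key)
    if buffer_info and buffer_info['session_key'] == session_key:
        for channel, msg_list in buffer_info.get('buffer', []):
            # Replay through the streams captured in the buffer (the same
            # detail discussed in the conversation above).
            on_zmq_reply(buffer_info['channels'][channel], msg_list)

    return channels
```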