
Can notebooks connect to a kernel together? #5140

Open
yangsenyi0116 opened this issue Jan 7, 2020 · 5 comments

@yangsenyi0116

I have two notebooks and want to share variables between them. How can I connect both notebooks to the same kernel?

@jasongrout
Member

This is very easy to do in JupyterLab: open both notebooks, then switch the kernel of one to point to the kernel of the other. It is very hard in the classic Notebook, since each notebook launches its own kernel session and the GUI provides no option to select a kernel that is already running.
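Outside the GUI, the same sharing works at the protocol level: any number of clients can attach to one kernel through its connection file. A minimal sketch using jupyter_client; the connection-file path is a placeholder you would get from `%connect_info` inside the notebook or from `jupyter --runtime-dir`:

```python
# Sketch: attach a second client to an already-running kernel.
from jupyter_client import BlockingKernelClient

kc = BlockingKernelClient()
kc.load_connection_file("/path/to/runtime/kernel-1234.json")  # hypothetical path
kc.start_channels()
kc.wait_for_ready(timeout=30)

# State set here is visible to every other client of the same kernel,
# including the notebook that originally started it.
kc.execute_interactive("shared_value = 42")
kc.execute_interactive("print(shared_value)")  # prints 42

kc.stop_channels()
```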

@jasongrout added this to the Reference milestone Jan 7, 2020
@yangsenyi0116
Author

I currently use Enterprise Gateway, and I found that when the notebook connects to a kernel, a request to /api/kernels/${kernel_id} is sent. Can the notebook pin this kernel_id through configuration?
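For reference, the kernel IDs the notebook talks to can be listed from the server's (or gateway's) REST API. A small sketch, where the base URL and token are placeholders for your deployment:

```python
# Sketch: list the running kernels (and their IDs) that a notebook could attach to.
import requests

GATEWAY_URL = "http://localhost:8888"   # hypothetical
TOKEN = "replace-with-your-token"       # hypothetical

resp = requests.get(
    f"{GATEWAY_URL}/api/kernels",
    headers={"Authorization": f"token {TOKEN}"},
)
resp.raise_for_status()

for kernel in resp.json():
    # Each entry is a kernel model whose 'id' identifies a running kernel.
    print(kernel["id"], kernel["name"], kernel.get("execution_state"))
```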

@kevin-bates
Member

As Jason said, this can be done via JupyterLab (and irrespective of EG). Here are screenshots of two notebooks with kernels running in a YARN cluster via EG; I started the second notebook, then changed its kernel to the one already in use by Untitled5.ipynb:
[Screenshot: Screen Shot 2020-01-07 at 8 24 40 AM]

Printing the variable i demonstrates the shared state. You can also see that the execution counters appear to be shared as well...

[Screenshot: Screen Shot 2020-01-07 at 8 25 05 AM]
Jupyter Notebook doesn't expose this functionality to the best of my knowledge.
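In code, the shared state the screenshots demonstrate looks like this (two notebooks, one kernel; the second notebook name is hypothetical):

```python
# Untitled5.ipynb, cell 1 -- the notebook that started the kernel
i = 10

# Untitled6.ipynb, cell run after switching its kernel to Untitled5's
print(i)   # -> 10, because both notebooks share one interpreter process
```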

@akashd11

Hi team,
Just an extension of this problem: the suggested solution of sharing a kernel removes the isolation between kernels. In my use case we have a SparkSession variable that is set in Python's builtins, and I only want the kernels to share that one variable. Is there any way I can set my Spark variable in the builtins module while the kernel starts?
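One common approach (independent of EG) is an IPython startup script, which every IPython kernel runs as it starts. A minimal sketch, assuming PySpark is installed and the startup file exists wherever the kernel process actually runs (for EG, that means the remote kernel image/node); the file name 00-spark.py is just the usual startup-directory convention:

```python
# ~/.ipython/profile_default/startup/00-spark.py
# Runs automatically when an IPython kernel starts, so every kernel
# gets its own `spark` name available through builtins.
import builtins

from pyspark.sql import SparkSession

# Create (or reuse) a SparkSession for this kernel and publish it in builtins,
# so user code can refer to `spark` without importing or defining anything.
builtins.spark = SparkSession.builder.appName("notebook-session").getOrCreate()
```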

@krassowski
Member

@akashd11 I think you may be interested in subshells, see:

Once this is merged, you will be able to have two subshells for the same kernel, sharing variables but executing independently. I believe that only the latest pre-release of ipykernel supports subshells as of today.


5 participants