
feat: Run exec_request inside a task

Open fleming79 opened this issue 9 months ago • 7 comments

This PR modifies the kernel to run exec_requests inside a task using Memory object streams to queue pending tasks. This means that other shell messages can be processed while the execution request is being processed.

ref: https://github.com/jupyterlab/jupyterlab/pull/17363#issuecomment-2760504700

New features

  • Adds the start_soon method to the kernel providing a thread safe equivalent to anyio.start_soon.

Breaking changes

  • The set_parent and get_parent methods are removed. Instead, the parent message is set only for the current execute request, giving better certainty about which request inputs and outputs belong to.

Known issues

  • Tests pass locally but occasionally fail (need help)

Example

Screenshot

Before

https://github.com/user-attachments/assets/72befb94-7c22-4e8b-99c8-2ebdfad8e136

After

https://github.com/user-attachments/assets/c8bc0f73-69e8-4eab-b4e4-8100f8194b66

Notebook code

Requires:

  • ipywidgets
  • jupyterlab

import anyio
import time
import asyncio
from anyio import sleep
from ipywidgets import HTML, IntSlider, __version__

slider = IntSlider()
print('ipywidgets', __version__)

async def run_till_100():
    display(slider)
    reported_value = HTML()
    display(reported_value)
    cval = -1
    while True:
        # time.sleep(0.1)  # not a good idea as it blocks the main thread
        await sleep(0.1)
        if cval != slider.value:
            cval = slider.value
            reported_value.value = f"Reported value: {cval}"
            print(cval)
        if cval == 100:
            print('Done')
            await anyio.sleep(0.7)
            slider.value = 0
            await anyio.sleep(0.01)
            return 'Done'
# Any of the following ways of running the coroutine lets other shell
# messages (e.g. widget comms) be processed while it runs:
await run_till_100()                       # 1. top-level await
await asyncio.create_task(run_till_100())  # 2. as an asyncio task
shell = get_ipython()
shell.kernel.start_soon(run_till_100)      # 3. the new thread-safe start_soon

fleming79 avatar Mar 29 '25 03:03 fleming79

Thanks @fleming79. That's going in the direction of akernel, so I can only approve :) One important thing though is kernel interrupt and how execute requests are cancelled (I don't see anything around that in the code).

davidbrochart avatar Mar 29 '25 12:03 davidbrochart

> Thanks @fleming79. That's going in the direction of akernel, so I can only approve :) One important thing though is kernel interrupt and how execute requests are cancelled (I don't see anything around that in the code).

@davidbrochart - thank you for the review and positive feedback!

I've updated the code so it will now cancel all requests where the submitted (monotonic) time is prior to the aborted time.
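The rule is roughly: on interrupt, cancel every queued request whose submit timestamp predates the abort. A minimal sketch, with hypothetical names (`PendingRequest`, `cancelled`) that are not the PR's actual code:

```python
# Sketch (assumed names): each queued request records time.monotonic() at
# submission; an interrupt cancels everything submitted before the abort time.
import time
from dataclasses import dataclass, field


@dataclass
class PendingRequest:
    code: str
    submitted: float = field(default_factory=time.monotonic)
    cancelled: bool = False


pending = [PendingRequest("a"), PendingRequest("b")]
time.sleep(0.01)
abort_time = time.monotonic()        # a kernel interrupt arrives now
time.sleep(0.01)
pending.append(PendingRequest("c"))  # submitted after the interrupt

for req in pending:
    if req.submitted < abort_time:   # older than the interrupt: cancel it
        req.cancelled = True

print([r.code for r in pending if not r.cancelled])  # ['c']
```

Using the monotonic clock rather than wall-clock time matters here, since wall-clock time can jump backwards (NTP adjustments) and break the "submitted before the abort" comparison.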

fleming79 avatar Mar 30 '25 01:03 fleming79

Here is a demo video showing a forked version of ipylab using await extensively.

The modified version of ipylab defines async functions to communicate with the frontend via custom widget messages (Widget.on_msg(...)).

https://github.com/user-attachments/assets/5a0c0619-3a10-40e1-aaaa-2fc44d6f4c80

fleming79 avatar Apr 07 '25 04:04 fleming79

@davidbrochart - Hoping you can find time to review this PR.

Relevant:

  • https://github.com/ipython/ipykernel/issues/1387
  • https://github.com/jupyter-server/team-compass/issues/73#issuecomment-2776366008

fleming79 avatar May 21 '25 22:05 fleming79

Thanks a lot for your work in this PR @fleming79, this looks really interesting. I don't have much time to dedicate to ipykernel these days, and given that work is ongoing in the 6.x branch, I'm not sure what to do. I'm curious to hear @minrk's opinion on this.

davidbrochart avatar May 23 '25 08:05 davidbrochart

All tests pass locally except those that get skipped for some reason in the "inprocess" folder.

fleming79 avatar May 23 '25 11:05 fleming79

@davidbrochart @minrk @krassowski @ianthomas23 @Carreau

Dear all,

I've drafted a stripped down version of ipykernel starting from this PR and am contacting you seeking feedback/advice.

For the moment I've called it asynckernel and the source is here.

The motivation was to provide an asynchronous native kernel enabling comms to pass whilst a cell is busy awaiting a result. It does this by queuing non-silent execute_requests and running them in a designated task. Silent execute requests are run in their own tasks and are non-cancellable.

To try it out you can pip install it from the source: asynckernel-0.0.0a1.tar.gz

https://github.com/user-attachments/assets/0824f48e-ef89-407f-8b29-fd0012440102

Test code


import anyio
from ipywidgets import IntSlider, HTML, __version__
import time

slider = IntSlider()
value = HTML()
print('ipywidgets', __version__)
display(slider)
display(value)

async def run_till_100():
    while True:
        await anyio.sleep(0.1)
        value.value = str(slider.value)
        if slider.value == 100:
            return 'Done'
print(await run_till_100())

fleming79 avatar Jun 20 '25 07:06 fleming79

@fleming79 I presume you have realised that with the recent branch manipulations this now targets the anyio branch and the future of anyio in ipykernel is uncertain.

As for my opinion, I expect everyone to be in favour of concurrent async cell execution. I think it likely that it will be supported in ipykernel 8 whether that is via anyio or not.

The journey from here to there is difficult though. Enabling concurrent cell execution isn't really the issue (it would be trivial on the current main branch), the problem is that it breaks downstream projects that have quite reasonably assumed sequential cell execution. For example, execution counts won't be correct and shell status messages will no longer alternate between busy and idle. There are probably lots of other things that I personally don't know about.

What is needed is to canvass the opinions of Jupyter maintainers across the up and downstream projects, not just here in ipykernel. The recommended way to do this is to open an issue on the Jupyter Enhancement Proposals repo and ask for feedback; then we'd find out the extent of the breaking changes it would cause and whether there is an appetite to go ahead.

ianthomas23 avatar Aug 15 '25 08:08 ianthomas23

@ianthomas23 Thank you kindly for the response and details about how to proceed.

I agree with you about breaking downstream projects, which is why I'm thinking of a new package. It won't be fully compatible with downstream projects.

Since my previous post about a month ago I've made significant progress on a new kernel.

Long story short - it's approaching a state that I would like to make it more widely available. If you're interested to have a look there is some draft documentation here and it can be installed with:

pip install -i https://test.pypi.org/simple/ async-kernel

I'm thinking of making a release of it as a package on PyPI proper and was wondering if you could suggest how I should deal with:

  • copyright on each page
  • copyright notice
  • Notes about the development team

Regarding a JEP - I think something along the lines of 'run modes' might be worth consideration.

fleming79 avatar Aug 20 '25 10:08 fleming79

Two cents:

  • adding features to ipykernel that are behind a flag should not be that big of a deal. It would ideally go through a JEP, and having a package where it is implemented first makes a JEP much easier, but I would not discount the idea of contributing this into ipykernel long-term.

krassowski avatar Aug 20 '25 10:08 krassowski