gator
Mamba still running after closing jupyterlab
Description
When gator is installed, mamba is still running after JupyterLab is closed: the GUI closes, but the mamba process does not finish. In the system monitor I can see that `mamba search --json` keeps running for about 1 min 30 s. Only once it completes does the process exit, i.e. I can use the terminal prompt again.
If mamba is not running when closing jupyterlab, then this issue doesn't occur.
Reproduce
- Open jupyterlab
- Wait until it has opened
- Close jupyterlab
Expected behavior
The process should stop immediately once JupyterLab stops.
Context
- Python package version: 5.0.0
- Extension version: @mamba-org/gator-lab v3.0.0 enabled OK (python, mamba_gator)
- General information:
active environment : base
active env location : /opt/miniconda3
shell level : 1
user config file : /home/eric/.condarc
populated config files : /home/eric/.condarc
conda version : 4.10.0
conda-build version : 3.21.4
python version : 3.8.8.final.0
virtual packages : __cuda=11.2=0
__linux=5.11.10=0
__glibc=2.32=0
__unix=0=0
__archspec=1=x86_64
base environment : /opt/miniconda3 (writable)
conda av data dir : /opt/miniconda3/etc/conda
conda av metadata url : https://repo.anaconda.com/pkgs/main
channel URLs : https://conda.anaconda.org/conda-forge/linux-64
https://conda.anaconda.org/conda-forge/noarch
package cache : /opt/miniconda3/pkgs
/home/eric/.conda/pkgs
envs directories : /opt/miniconda3/envs
/home/eric/.conda/envs
platform : linux-64
user-agent : conda/4.10.0 requests/2.25.1 CPython/3.8.8 Linux/5.11.10-200.fc33.x86_64 fedora/33 glibc/2.32
UID:GID : 1000:1000
netrc file : None
offline mode : False
Command Line Output
Paste the output from your command line running `jupyter lab` here, use `--debug` if possible.
Browser Output
Paste the output from your browser Javascript console here.
Hey @ericpre thanks for reaching out. When you say closing JupyterLab you mean closing the Webbrowser only? Or you mean shutting down the Jupyter server in the terminal?
Thanks for the quick reply, and sorry for not being specific. By closing JupyterLab I mean "File" -> "Shut Down" from the JupyterLab interface. If I kill the Jupyter server from the terminal with Control-C, it is different: the `mamba search --json` process is stopped immediately.
If JupyterLab has been running for long enough (longer than it takes `mamba search --json` to complete), so that this process is no longer running, then JupyterLab closes as expected.
It seems that there are two different things going on:
- `mamba search --json` takes 1 or 2 min to complete; does that make sense?
- gator doesn't kill `mamba search --json`
Thanks for the reply.
I'll try not to go too deep into technical details while explaining what is going on.
First, about `mamba search --json`:
- This is the command used to list all available packages. Unfortunately, conda package handling has a design flaw: it needs to download the complete list of packages from the channels each time it refreshes the available list (incremental updates are not supported). With new packages being added every day on conda-forge, that command is doomed to take ever longer.
- So unfortunately, yes, it takes quite some time.
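For a sense of what the extension does with that output, here is a minimal sketch of parsing a `mamba search --json`-style payload. The `newest_versions` helper and the sample data are hypothetical illustrations, not gator's actual code, and the payload shape is assumed for the example:

```python
import json

def newest_versions(search_json):
    """Reduce parsed `mamba search --json` output, assumed here to be
    shaped like {package_name: [{"version": ...}, ...]}, to the newest
    version per package (naive string comparison, for illustration)."""
    return {
        name: max(entry["version"] for entry in entries)
        for name, entries in search_json.items()
    }

# Tiny hand-made payload standing in for the real (huge) channel index.
sample = json.loads('{"numpy": [{"version": "1.19.0"}, {"version": "1.20.1"}]}')
print(newest_versions(sample))  # {'numpy': '1.20.1'}
```

The real index covers every build of every package on the channel, which is why the download and JSON serialization dominate the command's runtime.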
About why gator is not killing the task:
HTTP requests between the frontend and the backend must be stateless and quick, so tasks in the extension are run in separate executors. The frontend then needs to poll the server to obtain the task status (and the result once it is finished). Unfortunately, if a server shutdown is requested, I'm not aware of a notification mechanism that would allow the extension to stop still-running tasks - I asked on the server dev chat to find out whether this is possible.
Even if such a notification existed, stopping running tasks would need to be done carefully: environment modification tasks, for example, should not be cancelled, as that can corrupt the environment.
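The submit-then-poll pattern described above can be sketched roughly as follows. This is a simplified stand-in, not gator's actual handler code, and the `submit`/`poll` helpers are hypothetical:

```python
import time
import uuid
from concurrent.futures import ThreadPoolExecutor

# HTTP handlers stay quick by handing long jobs to an executor and
# returning a task id; the frontend polls that id until completion.
executor = ThreadPoolExecutor(max_workers=2)
tasks = {}

def submit(fn, *args):
    """Start fn in the background and return an id the client can poll."""
    task_id = str(uuid.uuid4())
    tasks[task_id] = executor.submit(fn, *args)
    return task_id

def poll(task_id):
    """Report task status; include the result once the task is done."""
    future = tasks[task_id]
    if not future.done():
        return {"status": "pending"}
    return {"status": "done", "result": future.result()}

def slow_search():
    time.sleep(0.2)  # stands in for a minutes-long `mamba search --json`
    return ["pkg-a", "pkg-b"]

tid = submit(slow_search)
while poll(tid)["status"] == "pending":
    time.sleep(0.05)
print(poll(tid))  # {'status': 'done', 'result': ['pkg-a', 'pkg-b']}

# Key point: a plain interpreter exit waits for running futures. Without
# a shutdown notification there is no place to request cancellation
# (Python 3.9+ syntax shown below).
executor.shutdown(wait=False, cancel_futures=True)
```

This also illustrates why the server appears to hang on "File" -> "Shut Down": the process cannot exit until the worker running the search returns.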
Thanks for the detailed explanation!
On `mamba search --json` being slow: if I run `mamba search` alone, it only takes a couple of seconds (4-5 s), while `mamba search --json` takes 1 min 30 s on my setup. Is this something I should report to mamba?
Even if such a notification existed, stopping running tasks would need to be done carefully: environment modification tasks, for example, should not be cancelled, as that can corrupt the environment.
Yes, indeed, and we can expect users not to kill the server (or the process) when they have started an update, which should be fine because an update would have been triggered by the user.
I have noticed that gator starts the `mamba search --json` process every time JupyterLab is started, which is not very convenient when opening a notebook just to check or copy some of its content, as I tend to do quite often. Maybe the `mamba search --json` process could be started only when gator is opened (through the menu "Settings/conda packages manager")?
`mamba json`
I'm not able to run that command using mamba 0.9.2 & conda 4.10.0.
Which version of mamba are you using?
I have noticed that gator starts the mamba search --json every time that jupyterlab is started
As the search takes some time, the idea is to trigger the listing refresh as early as possible. But your comment makes sense - it would be good to force this initial update only if the file is not there or if it is old ("old" being a user setting).
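That check could be as simple as comparing the cached file's modification time against a user-configurable maximum age. A minimal sketch, where the `needs_refresh` helper and the default threshold are illustrative, not actual gator settings:

```python
import os
import tempfile
import time

def needs_refresh(cache_path, max_age_seconds=24 * 3600):
    """Return True if the package list cache is missing or too old."""
    if not os.path.exists(cache_path):
        return True
    age = time.time() - os.path.getmtime(cache_path)
    return age > max_age_seconds

# A file written just now is fresh; a missing file forces a refresh.
with tempfile.NamedTemporaryFile(suffix=".json", delete=False) as f:
    f.write(b"{}")
    cache = f.name
print(needs_refresh(cache))            # False
print(needs_refresh("/no/such/file"))  # True
os.remove(cache)
```

With such a guard, the expensive `mamba search --json` would only run at startup when the cached listing is absent or stale.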
@ericpre contributions are welcome if you want to try implementing this enhancement.
`mamba json`
I'm not able to run that command using mamba 0.9.2 & conda 4.10.0.
Which version of mamba are you using?
Sorry, it was a typo! I meant `mamba search` without `--json` - I edited the comment above.
@ericpre contributions are welcome if you want to try implementing this enhancement.
I suspect that I don't know much about it, but I may have a look at it at some point - most likely not in the near future.
Sorry, it was a typo! I meant `mamba search` without `--json` - I edited the comment above.
No problem - the `--json` option forces the command to output its results as JSON so they can easily be handled by code (as in this extension).
I suspect that I don't know much about it, but I may have a look at it at some point - most likely not in the near future.
No pressure 😉