jupyterlab-variableInspector
Large lists cause the variable explorer to bog down the Jupyter session
I have a large list of results from a SciPy ODE solver, with each element of type `scipy.integrate._ivp.ivp.OdeResult`. I've noticed that when this results list is large, my Jupyter session becomes incredibly slow. According to `sys.getsizeof`, the list itself is relatively small (~10³ to 10⁴ bytes), whereas a pandas DataFrame I have is much larger (~10⁷ bytes) and has not caused the same issue. Using timers, I've confirmed that the code itself still runs at the expected speed; only interaction with the Jupyter session is slow. After disabling the variable explorer, my interaction with Jupyter returns to normal.

I suspect the issue has to do with the way the variable explorer probes variables. My guess is that accessing the list of results is inefficient, so when the variable explorer repeatedly probes it, everything slows down. In fact, when I display the list of results, it takes much longer to return than the larger pandas DataFrame mentioned above.
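One detail worth noting here: `sys.getsizeof` on a list measures only the list object itself (essentially its internal pointer array), not the elements it references, so a "small" list can still hold a large, expensive-to-inspect payload. A minimal sketch, using numpy arrays as stand-ins for the `OdeResult` objects:

```python
import sys
import numpy as np

# Stand-in for a list of solver results: 100 arrays of 100k float64 each.
results = [np.zeros(100_000) for _ in range(100)]

# getsizeof sees only the list's pointer array, not the ~80 MB of payload.
print(sys.getsizeof(results))          # roughly 1 KB
print(sum(a.nbytes for a in results))  # 80,000,000 bytes of actual data
```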
- Is my intuition correct as to why the session slows down?
- Would it be possible to add the ability to quickly enable and disable the variable explorer? Or at least have the ability to increase the amount of time between variable probes? As far as I can tell, the only way to fix the issue for me is to disable the variable explorer and then rebuild.
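As a stop-gap, and assuming the slowdown comes from the inspector repeatedly stringifying or sizing top-level variables (an assumption this report doesn't confirm), one workaround is to hide the expensive list behind a wrapper with a constant-time `repr`. The `Opaque` class below is a hypothetical sketch, not part of the extension. (On JupyterLab 3, `jupyter labextension disable <name>` may also allow toggling a prebuilt extension without a rebuild.)

```python
class Opaque:
    """Wrap a slow-to-inspect value behind a constant-time repr.

    Hypothetical workaround: it only helps if the inspector's cost
    comes from repr()-ing or stringifying top-level variables.
    """

    def __init__(self, value):
        self.value = value

    def __repr__(self):
        return f"<Opaque {type(self.value).__name__}>"


# Usage: wrap the results list and access the payload via `.value`.
results = Opaque([object() for _ in range(10_000)])
print(results)        # cheap to display, regardless of payload size
big = results.value   # the full list is still available for computation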
This issue is critical and easy to reproduce
This issue can be very problematic, as it's both quite hard to debug and can hamper work in a JupyterLab environment entirely, precisely when heavy computational work has just been completed.
Reproduction steps
First: initialize a minimal functional environment
Create a fresh conda environment from an environment.yml file like this:

```yaml
# environment.yml
name: jl-debugging
channels:
  - conda-forge
dependencies:
  - numpy
  - ipykernel
  - jupyterlab
  - python=3.9
  - pip
```

```shell
conda env create -n jl-debugging -f environment.yml
```
Download this small.pickle file with numpy data.
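If the linked small.pickle file is unavailable, a comparable ~38–40 MB numpy payload can be generated locally. This is a sketch; the exact contents of the original file are unknown, but they shouldn't matter for reproducing the lag:

```python
import pickle

import numpy as np

# 5 million float64 values ~= 40 MB on disk, close to the 38 MB original.
data = np.random.default_rng(0).random(5_000_000)
with open("small.pickle", "wb") as f:
    pickle.dump(data, f)
```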
Start jupyter lab from the jl-debugging conda environment next to the downloaded small.pickle file, then create a notebook with the following two cells.
```python
%%time
import pickle

with open("small.pickle", "rb") as f:  # 38 MB file
    my_var = pickle.load(f)
```

```python
%%time
print("Execution is very delayed, but fast")
```
Run them - it should be quick, everything works fine!
Second: reproduce the issue by installing this extension
- Install the extension: `pip install lckr_jupyterlab_variableinspector`
- Reload the jupyterlab window
- Restart the kernel and clear outputs
- Run the two cells and note it lagging on the print statement!
Issue demonstrated as a GIF animation
Environment specification
Selected packages, with versions, relevant to this bug report.
```
$ pip list
Package                           Version
--------------------------------- ---------
ipykernel                         6.4.2
ipython                           7.28.0
jupyter-client                    7.0.6
jupyter-core                      4.8.1
jupyter-server                    1.11.1
jupyterlab                        3.2.1
jupyterlab-pygments               0.1.2
jupyterlab-server                 2.8.2
lckr-jupyterlab-variableinspector 3.0.9
notebook                          6.4.5
numpy                             1.21.3
```
First of all, thanks @lckr for your work on this extension; I've appreciated it and know many others have as well!
I just wanted to signal-boost this issue report, as I think it is quite a critical bug.
Thanks @lckr and others for a great extension!
A solution to this problem would be a big win, as it still seems to be common. I've largely had to stop using the extension, but I'd love to pick it back up if the performance improved. Related: https://github.com/jupyter/notebook/issues/3303 https://github.com/jupyter/notebook/issues/3224#issuecomment-382300098