Hi everyone,
I've been using the VS Code Jupyter extension to work on some large datasets, but I've noticed significant slowdowns and lag when running cells, especially those that involve heavy computations or large DataFrame manipulations.
- Has anyone experienced similar performance issues?
- Are there recommended settings or best practices to optimize Jupyter Notebook performance in VS Code?
- Would using a different Python environment or hardware setup help?
Any advice or tips would be greatly appreciated!
Thanks in advance!
Originally posted by @arjunresha in https://github.com/microsoft/vscode-jupyter/discussions/16898
Hi! I’ve run into similar slow performance in the VS Code Jupyter environment, especially when working with large DataFrames. A few things that helped significantly:
1. Disable Notebook IntelliSense
VS Code tries to analyze the entire notebook, which slows down editing and execution.
Settings → Notebook: IntelliSense → Disable
2. Limit the Variable Explorer
The explorer can freeze when it tries to inspect large DataFrames. Adjusting these helps:
- Jupyter: Variable Explorer Exclude Large Variables
- Jupyter: Variable Explorer Max Output Size
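In `settings.json` this looks roughly like the sketch below. Note the setting ID and value are best-effort guesses based on the display names above; verify them in your own Settings UI, since they may differ between Jupyter extension versions:

```jsonc
{
  // Skip these object types in the variable explorer
  // (assumed ID and default — check your extension version).
  "jupyter.variableExplorerExclude": "module;function;builtin_function_or_method"
}
```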
3. Disable automatic plot rendering
Large plots can cause UI lag.
Jupyter › Plot Viewer: Receiver Enabled → Off
4. Run Jupyter outside of VS Code
Heavy workloads run more smoothly in a browser-based interface such as JupyterLab.
5. Use a clean Python environment
A fresh lightweight environment often improves performance:
```shell
python -m venv venv
venv\Scripts\activate       # Windows; on macOS/Linux: source venv/bin/activate
pip install numpy pandas jupyter ipykernel
```
6. Avoid displaying very large DataFrames
Showing huge outputs can freeze the interface. Prefer:

```python
df.head()   # first few rows only
df.info()   # schema summary instead of the data itself
```
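You can also cap pandas' display limits at the top of the notebook, so that an accidental bare `df` at the end of a cell cannot flood the output. A minimal sketch (the cap values and the example frame are arbitrary choices for illustration):

```python
import numpy as np
import pandas as pd

# Cap how much pandas renders; a bare `df` at the end of a cell
# will then show a truncated preview instead of everything.
pd.set_option("display.max_rows", 20)
pd.set_option("display.max_columns", 20)

# A large-ish frame to illustrate: one million rows.
df = pd.DataFrame({"a": np.arange(1_000_000), "b": np.random.rand(1_000_000)})

print(df.head())              # first 5 rows only
df.info(memory_usage="deep")  # schema + memory footprint, no data dump
```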
These adjustments removed most of the lag for me. Hopefully they help!
Hi! I’ve run into similar slowdowns in VS Code when using the Jupyter extension with larger DataFrames and heavier computations, so here are a few things that helped:
- Update VS Code & extensions: make sure you're on the latest versions of VS Code and the Jupyter and Python extensions; several performance issues have been fixed in recent releases.
- Close or limit the "Jupyter: Variables" panel: when the Variables view is open, VS Code tries to inspect large objects, which can make cell execution feel very laggy, especially with big pandas DataFrames. Closing that panel, or reducing what's shown in it, can speed things up noticeably.
- Avoid printing huge DataFrames in the output: rendering an entire large DataFrame in the notebook output can slow everything down. Instead, inspect just a small portion of the data:

  ```python
  df.head()                    # first 5 rows
  df.sample(10)                # 10 random rows
  df[['col1', 'col2']].head()  # just a couple of columns
  ```

  Big outputs are a common cause of slow notebooks.
- Profile your slow cells: you can use magics like

  ```python
  %timeit some_function()
  ```

  or `%%timeit` at the top of a cell to see which parts are really taking time, then optimize just those parts (e.g., vectorize pandas operations, avoid Python-level loops).
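Outside of IPython, the standard-library `timeit` module gives the same kind of measurement, and this sketch also shows the vectorization win that profiling usually points at (the function names and data size here are just for illustration):

```python
import timeit

import numpy as np


def loop_sum(values):
    # Python-level loop: each element crosses the C/Python boundary.
    total = 0.0
    for v in values:
        total += v
    return total


def vector_sum(values):
    # Vectorized NumPy reduction: the whole loop runs in C.
    return float(values.sum())


data = np.random.rand(100_000)

t_loop = timeit.timeit(lambda: loop_sum(data), number=10)
t_vec = timeit.timeit(lambda: vector_sum(data), number=10)
print(f"loop: {t_loop:.4f}s  vectorized: {t_vec:.4f}s")
```

Both functions compute the same sum; the vectorized version is typically orders of magnitude faster on arrays this size.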
- Consider tools for larger-than-memory data: if your datasets are truly large, libraries like Dask or Polars can be more efficient than plain pandas for some workloads; they're designed to handle bigger data more gracefully.
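If adding Dask or Polars isn't an option, plain pandas can at least stream a file in bounded-memory chunks instead of loading it all at once. A small sketch (the in-memory CSV stands in for a real large file on disk):

```python
import io

import pandas as pd

# Stand-in for a CSV too big to load at once: 10,000 rows in memory.
csv_data = io.StringIO("value\n" + "\n".join(str(i) for i in range(10_000)))

total = 0
rows = 0
# chunksize makes read_csv yield DataFrames of at most 1,000 rows each,
# so peak memory stays bounded regardless of the file's total size.
for chunk in pd.read_csv(csv_data, chunksize=1_000):
    total += chunk["value"].sum()
    rows += len(chunk)

print(rows, int(total))  # → 10000 49995000
```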
- Try JupyterLab or a remote environment as a comparison: run the same notebook in classic Jupyter/JupyterLab and in VS Code. If performance is fine there but not in VS Code, that's a strong signal the bottleneck is the editor/extension and not your code.
Hope this helps! If you share a minimal example (code + data shape), people might be able to give more specific suggestions.