make sandbox compatible with edit server (multiple notebooks)
Description
Currently --sandbox works great for single notebooks.
However, when you start the edit server, two problems occur:
- there is a single isolated environment shared by all notebooks
- the per-script metadata (PEP 723) is ignored on notebook launch; instead, required packages are auto-discovered.
It would be lovely to make sandbox compatible with this mode, since we could then really utilize the power of uv with its global cache, and users could switch projects easily without worrying about dependencies and environments. That would be really beginner friendly: no terminal needed to create or activate manually managed environments.
Suggested solution
Create some sort of logic that activates and deactivates a uv isolated environment on notebook launch and shutdown from the edit server.
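To make the suggestion concrete, here is a rough sketch of how the edit server could build one `uv run` invocation per notebook, each with its own requirements file extracted from the notebook's PEP 723 block. The helper name is hypothetical and the flags simply mirror what `--sandbox` already logs for single notebooks; marimo's real launcher would differ:

```python
import tempfile
from pathlib import Path


def sandbox_command(notebook: Path, requirements: list[str]) -> list[str]:
    """Build a per-notebook `uv run` invocation (hypothetical helper).

    Each notebook gets its own requirements file, so each kernel process
    resolves its own isolated environment from uv's global cache.
    """
    req = tempfile.NamedTemporaryFile(
        "w", suffix=".txt", prefix="marimo-sandbox-", delete=False
    )
    req.write("\n".join(requirements))
    req.close()
    return [
        "uv", "run", "--isolated", "--no-project",
        "--with-requirements", req.name,
        "marimo", "edit", str(notebook),
    ]


# The edit server would spawn one such process per opened notebook
# (e.g. via subprocess.Popen) and terminate it on notebook shutdown.
```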
Alternative
No response
Additional context
No response
Oh, interesting... didn't realize this was an issue. So what is actually happening right now when I run marimo edit --sandbox is that packages for all notebooks are using the same env?
And it sounds like this would explain why I am being asked to re-install packages when reopening a notebook?
FWIW, have been working this way for a bit, and it does seem to be working-ish:
$ marimo edit --sandbox
Running in a sandbox: uv run --isolated --no-project --with-requirements /tmp/tmpz2c8w8sx.txt marimo edit
Create or edit notebooks in your browser 📝
URL: http://localhost:2718?access_token=lGtxh53-cAaCqe8l1yQzGA
Resolved 5 packages in 322ms
Prepared 3 packages in 106ms
Installed 3 packages in 20ms
+ charset-normalizer==3.4.0
+ requests==2.32.3
+ urllib3==2.2.3
Resolved 6 packages in 441ms
Prepared 6 packages in 2.62s
Installed 6 packages in 405ms
+ numpy==2.1.2
+ pandas==2.2.3
+ python-dateutil==2.9.0.post0
+ pytz==2024.2
+ six==1.16.0
+ tzdata==2024.2
@gabrielgrant - it does re-install the packages, but since they are cached by uv, it should be pretty fast
I have been using marimo for a couple of days and it is an exciting project, but I think this is the biggest challenge right now if I want to have multiple notebooks running on a remote server. Ideally we could install marimo in our remote environment like we do with Jupyter, and then have each notebook be sandboxed. This would get us closer to the current workflow, where users create different notebook kernels that they can use. Also, is it possible for each notebook to run a different version of Python?
Being able to reference a pyproject.toml when making a new notebook would also be nice so that users don't need to do uv add ... each time they make a new notebook.
I'm facing the same issue; while it is not a big problem having to reinstall the dependencies each time (as stated by @mscolnick, uv cache helps a lot here), it does break things when the import module doesn't have the same name as the package.
My use case: I use the python-docx package, which is imported with import docx. The dependency is there:
# /// script
# requires-python = ">=3.13"
# dependencies = [
#     "marimo",
#     "pandas==2.2.3",
#     "python-docx==1.1.2",
# ]
# ///
but when I run marimo edit --sandbox <root directory> I get asked to install the docx package, which is a different distribution and not compatible with my code.
@sanzoghenzo this seems like maybe another issue with the hard-coded package name mappings? @mscolnick describes those here -- does the issue you're experiencing seem similar to what is described in #2844?
Thanks for the feedback!
I forgot to update this issue, I already filed the PR (and it's already been merged) to add the right mapping docx -> python-docx (and the same for pdfminer -> pdfminer.six) 😉
Hey, adding to this request. Let me know if a new issue is in order.
Issue
When I have a python script with a TOML header (as I often do with scripts I run with uv), I would expect the same behavior as outlined here: https://docs.astral.sh/uv/guides/scripts/#declaring-script-dependencies, specifically, ...uv will automatically create an environment with the dependencies necessary to run the script...
Expected behavior
It runs without any errors/prompts, since I already declared the imports in the TOML header. This would match how uv run script.py works on a regular host, where it auto-creates a venv for that script, installs the deps from the TOML header, and runs it.
Actual behavior
It says "not installed" each time, and I have to manually click install. It does, however, add missing deps to the TOML header, which is odd: it can write to the header but apparently can't read from it?
ModuleNotFoundError
No module named 'requests'
See the console area for a traceback.
Background
I created a Dockerfile below that I use (so I can host and access remotely).
FROM ghcr.io/marimo-team/marimo:latest-sql
EXPOSE 8080
CMD ["marimo", "edit", "--sandbox", "--no-token", "-p", "8080", "--host", "0.0.0.0"]
ran it with this:
services:
  marimo-webapp:
    ports:
      - 127.0.0.1:3002:8080
    container_name: marimo-webapp
    volumes:
      - ./scripts:/app/scripts
      - ./data:/app/data
      - .marimo.toml:/app/.marimo.toml
      - uv-cache:/root/.cache/uv
    build: .
    restart: unless-stopped
volumes:
  uv-cache:
and create a script.py like this (note I tried having the TOML header as shown below and within the cell, same issue.)
# /// script
# dependencies = [
#     "marimo",
#     "requests<3",
#     "rich",
# ]
# ///
import requests
from rich.pretty import pprint
resp = requests.get("https://peps.python.org/api/peps.json")
data = resp.json()
pprint([(k, v["title"]) for k, v in data.items()][:10])
import marimo

__generated_with = "0.14.9"
app = marimo.App(width="medium")

@app.cell
def _():
    import requests
    from rich.pretty import pprint

    resp = requests.get("https://peps.python.org/api/peps.json")
    data = resp.json()
    pprint([(k, v["title"]) for k, v in data.items()][:10])
    return

@app.cell
def _():
    import marimo as mo
    return

if __name__ == "__main__":
    app.run()
When I click on the script to open it in the web UI and run it, it triggers the dependency detection and requires me to manually install each dep, even though the TOML header already lists them all. It should behave the same as a uv script, where everything is taken care of for the user.
This not working is known behavior, but we had a backend change that should make this easier. Thanks for bumping the post!
Ah, good to know (that it was known). Would the change make it work the same as uv? Not sure what "easier" means in this case. Out of curiosity, what was the goal of supporting the TOML header then? Just portability of the script, so one can share it? But if that's the case and auto-discovery drives the installation of the deps... it seems like it doesn't work as intended?
The difficulty is "hot swapping" the environments. As it is, the server is served off one process, so the model just needs to shift to spawning and managing multiple processes.
The TOML header does work for marimo edit --sandbox mynotebook.py and is super useful for that.
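To make "spawning and managing multiple processes" concrete, here is a minimal sketch of what per-notebook lifecycle management could look like. The class and method names are hypothetical, not marimo's actual code:

```python
import signal
import subprocess


class SandboxManager:
    """Track one sandboxed kernel process per open notebook and tear it
    down when the notebook is closed (hypothetical sketch)."""

    def __init__(self) -> None:
        self._procs: dict[str, subprocess.Popen] = {}

    def open_notebook(self, path: str, cmd: list[str]) -> None:
        # cmd would be the per-notebook invocation, e.g.
        # ["uv", "run", "--isolated", ..., "marimo", "edit", path]
        if path not in self._procs:
            self._procs[path] = subprocess.Popen(cmd)

    def close_notebook(self, path: str) -> None:
        proc = self._procs.pop(path, None)
        if proc is not None and proc.poll() is None:
            proc.send_signal(signal.SIGTERM)
            proc.wait(timeout=10)
```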
Ah gotcha, thanks for the clarification!