jupyterhub-deploy-docker
Q: A few questions about the JupyterHub via Docker deployment model ...
Hello Friends:
I'm considering using this deployment model -- https://github.com/jupyterhub/jupyterhub-deploy-docker -- for deploying JupyterHub via Docker on a PC with very robust specs (i.e. a high-end gaming PC, though I don't use it for that). Before going down this road, I'd like to ask friends here a few questions.
- My use case will be giving screen-share classes to between 10 and 20 students. Recognizing that it depends on what the notebook cells are concurrently doing, is this user count generally reasonable for this deployment model (again, JupyterHub via Docker on a stout PC)?
- On that same PC, I have GitLab CE installed, which I would like to use as the OAuth provider (instead of GitHub). Is there anything in this deployment model that would prevent configuring that? I suspect I would create an API application on my GitLab CE instance instead of GitHub and use that instead.
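For what it's worth, pointing JupyterHub at a self-hosted GitLab is normally just a matter of configuring the `oauthenticator` package's GitLab authenticator. A minimal sketch -- every URL, ID, and secret below is a placeholder, and the `GITLAB_URL` override is my assumption about how to target a CE instance rather than gitlab.com:

```python
# jupyterhub_config.py -- sketch of GitLab CE as the OAuth provider.
# All URLs, client IDs, and secrets are illustrative placeholders.
import os

from oauthenticator.gitlab import GitLabOAuthenticator

# Point the authenticator at the self-hosted GitLab CE instance
# (by default it talks to gitlab.com).
os.environ["GITLAB_URL"] = "https://gitlab.example.com"

c.JupyterHub.authenticator_class = GitLabOAuthenticator
c.GitLabOAuthenticator.client_id = "<application-id-from-gitlab>"
c.GitLabOAuthenticator.client_secret = "<application-secret-from-gitlab>"
c.GitLabOAuthenticator.oauth_callback_url = (
    "https://jupyterhub.example.com/hub/oauth_callback"
)
```

The application ID and secret come from registering a new application in the GitLab admin area, with the callback URL above as its redirect URI.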
- Finally, is there anything in this deployment model that would prevent containers from accessing services / ports running on the bare-metal PC host itself -- for example, mongodb or whatever else? From what I read, the containers share the same Docker network, but can they also reach the host's network? I can figure out how to do that; I'm really just trying to understand whether there are networking show-stoppers before pressing on with this deployment model.
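One way this is commonly handled, sketched below under the assumption that the deployment uses DockerSpawner and a Docker version (20.10+) that understands the `host-gateway` alias -- treat the names as illustrative, not as the repo's shipped config:

```python
# jupyterhub_config.py -- sketch: let user containers reach the Docker host.
# Assumes DockerSpawner and Docker 20.10+ ("host-gateway" support).
c.DockerSpawner.extra_host_config = {
    # Inside each container, "host.docker.internal" then resolves to the
    # host machine, so a notebook could reach e.g. mongodb at
    # host.docker.internal:27017 -- provided mongodb listens on an
    # interface the containers can route to, not only 127.0.0.1.
    "extra_hosts": {"host.docker.internal": "host-gateway"},
}
```

The caveat in the comment matters: a service bound only to the host's loopback interface is invisible to containers no matter what the DNS entry says.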
Thank you very much in advance!
Nothing's stopping #2 -- lots of OAuth methods are supported. You can look into jupyter-server-proxy for proxying applications within the same container. Within the same Docker network isn't a problem, so you could containerize mongo. As for access to services on the bare metal: if your browser can hit it, the app should be able to as well. You probably won't be able to pass a single host port into all of the containers at once, though.
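To sanity-check that "if your browser can hit it" point from inside a user container, a quick stdlib-only TCP probe works; the hostname and port in the example are assumptions to adjust for your own service:

```python
import socket


def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Example (hypothetical hostname/port): from a notebook, check whether
# mongodb on the Docker host is reachable:
# port_reachable("host.docker.internal", 27017)
```

Run it from a notebook cell inside a spawned container; `True` means the host service is routable from the container's network.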
10-20 students on a stout PC ... not sure what that means spec-wise. If you have 8 cores + 32 GB of RAM, I think it'll be okay as long as you don't push it too hard. I've done concurrent work with one other person on a 1-core, 2 GB machine, and the difference was noticeable but absolutely bearable.
Hi @mathematicalmichael
Thank you for your reply and for sharing your experiences. I apologize for my delay in expressing gratitude (I'm usually quite prompt ... almost always on the order of hours, not months. LoL). =:) Thank you.
Hi! Support questions like this are best handled on https://discourse.jupyter.org/