
Detecting CPU and RAM limits inside a container

Open esevan opened this issue 4 years ago • 3 comments

Hi! I was referred here from this discussion (see the link for more detail): https://discourse.jupyter.org/t/detecting-cpu-and-ram-limits-on-mybinder-org/4640

The CPU and memory totals reported by psutil do not reflect Docker's hard limits (set with --cpus=1 and --memory=1g respectively):

$ docker run -it --memory 1g jupyter/minimal-notebook bash
jovyan@ae5ee84233e0:~$ python
Python 3.7.3 (default, Mar 27 2019, 22:11:17) 
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import psutil
>>> psutil.virtual_memory().total
16742875136 # the host's total memory, not the 1 GB limit I set!

To address this issue, I would like to contribute by implementing the following:

  • [ ] Detect the current OS
  • [ ] Check whether we are running inside a container or on the host
  • [ ] Get CPU and memory metrics from cgroup info when inside a container
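A minimal sketch of the cgroup lookup step, assuming the conventional v2 and v1 mount paths (these paths, and the unlimited-value sentinels, are assumptions that may differ between distributions):

```python
import os

def cgroup_memory_limit():
    """Return the container's memory limit in bytes, or None if no limit
    is found. Checks cgroup v2 first, then v1; the paths below are the
    conventional mount points, not guaranteed on every system."""
    candidates = [
        "/sys/fs/cgroup/memory.max",                    # cgroup v2
        "/sys/fs/cgroup/memory/memory.limit_in_bytes",  # cgroup v1
    ]
    for path in candidates:
        try:
            with open(path) as f:
                raw = f.read().strip()
        except OSError:
            continue  # file absent: try the next cgroup layout
        if raw == "max":       # cgroup v2 reports "max" when unlimited
            return None
        limit = int(raw)
        if limit >= 2 ** 62:   # cgroup v1 uses a huge sentinel when unlimited
            return None
        return limit
    return None
```

Inside a container started with `--memory 1g`, such a lookup would yield 1073741824 rather than the host total that `psutil.virtual_memory().total` returns.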

Any contribution guide would be so much appreciated!

esevan avatar Jun 08 '20 11:06 esevan

Just a comment. I read your linked proposal and saw that you sum the per-process RSS values with psutil. However, the total_rss value in /sys/fs/cgroup/memory/memory.stat seems to be more accurate (or at least closer to what docker stats reports). In any case, thanks a lot for proposing this feature.
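The total_rss lookup described above can be sketched as a small parser over the memory.stat contents (the sample text below is a hypothetical excerpt; real files contain many more keys):

```python
def parse_total_rss(stat_text):
    """Extract total_rss (bytes) from cgroup v1 memory.stat contents.
    Returns None if the key is absent (e.g. on cgroup v2 hosts)."""
    for line in stat_text.splitlines():
        key, _, value = line.partition(" ")
        if key == "total_rss":
            return int(value)
    return None

# Hypothetical memory.stat excerpt:
sample = "cache 1048576\nrss 524288\ntotal_rss 734003200\n"
print(parse_total_rss(sample))  # → 734003200
```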

alelorca avatar Jun 08 '20 15:06 alelorca

> it seems that the value total_rss of /sys/fs/cgroup/memory/memory.stat is more accurate

Thank you so much for your comment!

esevan avatar Jun 09 '20 00:06 esevan

dask has a function for calculating the CPUs actually available: https://github.com/dask/dask/blob/1615724e22206112a6864bcd8b0d3f6afa07204e/dask/system.py#L15
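Loosely modeled on that dask helper, the CPU side can be sketched by capping `os.cpu_count()` with the cgroup v1 CFS quota (the file paths are the conventional v1 locations and are assumptions, not guaranteed):

```python
import math
import os

def effective_cpu_count():
    """Estimate the CPUs actually usable, honoring a cgroup v1 CFS quota.
    Falls back to os.cpu_count() when no quota files are present."""
    count = os.cpu_count() or 1
    try:
        with open("/sys/fs/cgroup/cpu/cpu.cfs_quota_us") as f:
            quota = int(f.read())
        with open("/sys/fs/cgroup/cpu/cpu.cfs_period_us") as f:
            period = int(f.read())
    except (OSError, ValueError):
        return count  # no quota info available: trust os.cpu_count()
    if quota > 0 and period > 0:
        # docker run --cpus=1 sets quota == period, so this yields 1
        count = min(count, math.ceil(quota / period))
    return max(count, 1)
```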

dhirschfeld avatar Jul 29 '23 21:07 dhirschfeld