
peakRAM with parallel processing

Open talegari opened this issue 8 years ago • 2 comments

@tpq Thanks for a handy package. I had been using a rough utility of my own for the same purpose: https://gist.githubusercontent.com/talegari/ad06da7795b8771e2e152f304ca00f6f/raw

Do you have any idea how to compute peak RAM when multiple cores are used, i.e. when a SOCK or fork cluster is instantiated?

talegari avatar Jan 16 '17 10:01 talegari

@talegari Thank you for your interest in peakRAM! Admittedly, anything to do with parallelizing R really pushes the limits of my knowledge. However, I'm happy to give this problem a think-over. Do you happen to have a simple piece of reproducible code I could try out?

I wonder whether something like the following pseudo-code could work. I assume the garbage collector will detect RAM use regardless of the number of R processes distributed? Maybe not, though...

runParallelJob <- function(){  # renamed so it doesn't shadow parallel::makeCluster
  # make cluster
  # deliver jobs across multiple cores
  # close cluster
}

peakRAM::peakRAM(runParallelJob())
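A minimal runnable version of this sketch, assuming a 2-core PSOCK cluster; the wrapper name `runParallelJob` is hypothetical (and chosen so it does not shadow `parallel::makeCluster`). Since peakRAM's accounting is based on `gc()` in the calling R session, memory allocated inside the worker processes may well not show up here at all:

```r
library(parallel)
library(peakRAM)

runParallelJob <- function() {
  cl <- makeCluster(2)               # make cluster
  on.exit(stopCluster(cl))           # close cluster when done
  parLapply(cl, 1:4, function(i) {   # deliver jobs across multiple cores
    x <- rnorm(1e6)                  # allocate some memory in each worker
    mean(x)
  })
}

peakRAM(runParallelJob())
```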

tpq avatar Jan 16 '17 10:01 tpq

Since this issue is still open:

I noticed that my R code actually uses almost 15 GB of RAM according to htop, while peakRAM reports a peak of only about 300 MB. The code block uses several functions from BiocParallel. I'm still checking whether the discrepancy is caused by the parallel methods, or simply because some methods cannot be monitored by peakRAM.
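One plausible explanation is that peakRAM's `gc()`-based accounting only sees the parent R process, so memory held by BiocParallel's worker processes goes unrecorded. A hedged sketch of a workaround, assuming a plain `parallel` PSOCK cluster, is to call `gc(reset = TRUE)` before the workload and read back the "max used" counters after it, inside each worker:

```r
library(parallel)

cl <- makeCluster(2)
perWorkerPeakMB <- parLapply(cl, 1:4, function(i) {
  gc(reset = TRUE)                        # reset this worker's "max used" counters
  x <- rnorm(5e6)                         # the real workload would go here
  res <- sum(x)
  g <- gc()                               # read the counters back
  g["Vcells", "max used"] * 8 / 1024^2    # Vcells are 8 bytes each -> MB
})
stopCluster(cl)
unlist(perWorkerPeakMB)                   # per-worker peak (MB), parent excluded
```

This only captures R-level allocations per worker; memory used by compiled code outside R's allocator would still be invisible, which may account for part of the htop gap.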

plantton avatar Mar 12 '21 10:03 plantton