posterior
Warn users if multicore summarise_draws risks exhausting available RAM
My first big test of multicore `summarise_draws()` on a really big (~9 GB in memory) array plowed straight into swap on my 40 GB machine, and I had to kill it. The second test, after reducing cores from 4 to 3, was awesome!
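For reference, here is a quick way to gauge the footprint manually before committing to multiple cores (a hypothetical snippet, not part of posterior; `x` stands for the big draws array):

```r
# Hypothetical manual check before calling summarise_draws() with .cores > 1.
print(object.size(x), units = "GB")  # in-memory size of the draws array
parallel::detectCores()              # logical cores available on this machine
memuse::Sys.meminfo()                # total and free system RAM (memuse package)
```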
If a naive user runs their big model for days and then overenthusiastically runs `summarise_draws()` prior to saving, they could be in a world of hurt.
Given that the main use case for multicore processing with `summarise_draws()` involves really big draws arrays, we could consider adding a `.ram_safety = TRUE` argument and then doing something like:
```r
if (.cores > 1 && .ram_safety &&
    (memuse::memuse(object.size(x)) > (memuse::Sys.meminfo()$totalram / (.cores + 1)))) {
  warning(paste(
    "The memory requirements of multicore processing with", .cores,
    "cores may exceed the RAM on your system. Consider using fewer cores.",
    "If you wish to proceed with", .cores, "cores, set `.ram_safety = FALSE`",
    "and save any important objects prior to running."
  ))
} else {
  # Do the computation
}
```
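To illustrate the proposed threshold with the numbers from my report above (a ~9 GB array on a 40 GB machine), a quick back-of-the-envelope check; the `.cores + 1` denominator presumably accounts for the parent session's copy on top of the worker processes:

```r
# Back-of-the-envelope version of the proposed check, using the numbers above.
obj_gb   <- 9    # approximate in-memory size of the draws array
total_gb <- 40   # total system RAM
obj_gb > total_gb / (4 + 1)  # 4 cores: 9 > 8,  TRUE  -> warn
obj_gb > total_gb / (3 + 1)  # 3 cores: 9 < 10, FALSE -> proceed
```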
The main drawback is the extra package requirement for the multicore case (the dependency could go in Suggests).
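If memuse went into Suggests, the check would presumably need to be guarded; a minimal sketch of such a guard (hypothetical, not part of the proposal above):

```r
# Hypothetical guard for a Suggests-only dependency: skip the RAM check
# (or prompt for installation) when memuse is not installed.
if (.cores > 1 && .ram_safety && !requireNamespace("memuse", quietly = TRUE)) {
  message(
    "Install the 'memuse' package to enable the RAM safety check, ",
    "or set `.ram_safety = FALSE` to skip it."
  )
}
```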
Thanks! I will think a little more about it and then revisit this issue. In the meantime, what do others think?
I'll think a bit more about this too, but in the meantime, @jsocolar, do you want to add a note about RAM to the docs? Regardless of whether we go a step further and add the argument/warning/dependency, we should definitely document the concern about RAM for really large inputs.
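A possible starting point for that documentation note (a hypothetical roxygen sketch for the `summarise_draws()` help page, wording to be refined):

```r
#' @details
#' When `.cores > 1`, each worker process may need its own copy of the draws
#' object, so peak memory use can be several times the in-memory size of the
#' input. For very large draws arrays, consider using fewer cores and save any
#' important objects before calling `summarise_draws()`.
```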