Rich FitzJohn
Thanks for reposting this. My gut feeling is that this is a documentation issue, so can you (1) rerun your memory-intensive code with and without changing this...
OK, cool. I'll have a go at clarifying the docs for the next release. Does it help the example you posted?
Ah, this is very useful. The documentation should make it clear that what you want is the opposite! You want to set `use_cache = FALSE`. Reading what is there I...
No, sorry - that would be a breaking change, and I don't think it's a better name. I'll improve the docs though. Does changing the value help with your memory...
Obviously it's up to you, but I think you should check for performance regressions with https://github.com/ropensci/drake/commit/6659efefac9cad5f867095e5a57e3dc404df0d5d - doing this will massively increase the number of disk reads you do
No, it does not limit cache size - doing that would require significant additional work (and would therefore need careful thought) because we'd need to order the cache as...
> A good place to start might be to respect the memory limits of the R session itself. Beyond that, decisions get trickier, I agree. Perhaps you could be more...
That's quite different from not respecting "the memory limits of the R session itself" - it's just what happens when you're out of memory. There is no straightforward data structure...
There is a `flush_cache` method for pruning the whole cache at the moment. Any more than that would require an additional data structure for recording when the data was added,...
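To illustrate the kind of bookkeeping such pruning would require - this is a generic sketch in Python, not storr's actual implementation - a size-limited cache needs an ordering structure that records when each entry was last used, so the least recently used entries can be evicted first:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache: once max_size is exceeded,
    it evicts the entry touched longest ago. The OrderedDict is the
    extra ordering data structure a size-limited cache would need."""

    def __init__(self, max_size):
        self.max_size = max_size
        self._data = OrderedDict()

    def set(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)  # refresh recency on overwrite
        self._data[key] = value
        while len(self._data) > self.max_size:
            self._data.popitem(last=False)  # drop least recently used

    def get(self, key):
        value = self._data[key]
        self._data.move_to_end(key)  # reading also counts as a use
        return value

cache = LRUCache(max_size=2)
cache.set("a", 1)
cache.set("b", 2)
cache.get("a")     # "a" is now the most recently used entry
cache.set("c", 3)  # evicts "b", the least recently used
```

The point is only that eviction needs recency information that a flat key/value store does not keep, which is why a size limit is more than a small change.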
I don't really see the advantage of a random purge - that just adds nondeterministic behaviour. And cleaning the cache out is always "safe" - it just may slow things...