`dictionary` with large amounts of data
Hello!
We use your gem to make copies of organizations within our application. Many objects are tied to an organization, and some dependencies are indirectly circular, so I use the dictionary option to clone a few objects manually before calling deep_clone over the entire organization. Without the dictionary, deep_clone also sometimes cloned objects multiple times because of those dependencies.
However, the dictionary makes my system run out of memory because the Hash grows too large. Do you have any tips for reducing the memory footprint? Two ideas I have:
- Use the `to_s` representation of an object (or a SHA512 hash of it?) as the key that maps to the cloned object.
- Allow specifying that only certain types of objects need to be looked up in the dictionary (I roughly know which dependencies are problematic).
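To illustrate the first idea, here is a minimal pure-Ruby sketch (the `Matey` struct and `dictionary_key` helper are made up for illustration, not part of deep_cloneable): the dictionary is keyed by a SHA512 digest of a compact string identity instead of by the record object itself, so the keys stay small regardless of how heavy the records are:

```ruby
require "digest"

# Illustrative stand-in for an ActiveRecord model (not deep_cloneable's API).
Matey = Struct.new(:id, :name)

# Derive a small, stable dictionary key instead of using the record object
# itself: a SHA512 digest of "ClassName#id".
def dictionary_key(record)
  Digest::SHA512.hexdigest("#{record.class}##{record.id}")
end

dictionary = {}

original = Matey.new(1, "Smee")
clone    = Matey.new(nil, "Smee")

# Store and look up the clone via the compact key.
dictionary[dictionary_key(original)] = clone

dictionary[dictionary_key(original)].equal?(clone) # => true
```

The trade-off is that the key must really be unique per record (class plus primary key usually is); a plain `to_s` that two records can share would merge their clones.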
Appreciate your work and response.
Hi,
Both seem like sensible options. Maybe these optimizations should be optional and toggled via a dictionary_options param. Something like:
pirate.deep_clone(dictionary_options: { key: :itself, models: :all }) # default
pirate.deep_clone(dictionary_options: { key: :id, models: %i[mateys treasures] })
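A rough sketch of how a dictionary honoring such options could behave, in plain Ruby (the `SelectiveDictionary` class, its method names, and the `Treasure` struct are hypothetical, not existing deep_cloneable code): only whitelisted models are tracked, and keys are derived via the configured method so full records need not serve as keys:

```ruby
# Hypothetical sketch of the proposed dictionary_options behaviour;
# none of these names exist in deep_cloneable today.
class SelectiveDictionary
  def initialize(key: :itself, models: :all)
    @key    = key      # method used to derive the lookup key, e.g. :id
    @models = models   # :all, or a whitelist of model names (symbols)
    @store  = Hash.new { |h, k| h[k] = {} }
  end

  # Only deduplicate records whose model is whitelisted; everything else
  # is cloned afresh, which keeps the dictionary small.
  def fetch(model, record)
    return yield unless tracked?(model)
    @store[model][record.public_send(@key)] ||= yield
  end

  private

  def tracked?(model)
    @models == :all || @models.include?(model)
  end
end

Treasure = Struct.new(:id)

dict = SelectiveDictionary.new(key: :id, models: %i[treasures])
gold = Treasure.new(42)

first  = dict.fetch(:treasures, gold) { gold.dup }
second = dict.fetch(:treasures, gold) { gold.dup }
first.equal?(second) # => true, cloned only once

# :mateys is not whitelisted, so each fetch yields a fresh clone.
matey = Treasure.new(7)
a = dict.fetch(:mateys, matey) { matey.dup }
b = dict.fetch(:mateys, matey) { matey.dup }
a.equal?(b) # => false
```

With `key: :id` the whitelisted models would need persisted ids at lookup time, which fits the use case here since the originals already exist in the database.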
Could you maybe create a branch for something like this?