Sorcar
Memory Usage
I've noticed that memory usage keeps going up and is never freed when you change the seed value on a Select Random node. The memory does not go down if you delete the object either; you have to restart Blender to free it.
This issue is not limited to the Select Random node; apparently changing any slider increases memory usage.
If you launch Blender from a terminal with blender --debug-memory and then press F3 and search for "Memory Statistics", you can see BLI_Mempool Chunk grow without limit, with a lot of items, when using a Custom Object and any Sorcar node. I asked on devtalk whether that memory can be freed from Python:
https://devtalk.blender.org/t/is-it-possible-clear-bli-mempool-chunk-memory-from-python/11081
and here:
https://blenderartists.org/t/sorcar-bli-mempool-chunk-memory-increases-without-limits/1198207/1
Memory growth has been reduced with this: https://github.com/zebus3d/Sorcar/blob/development/nodes/inputs/ScCustomObject.py and this: https://github.com/zebus3d/Sorcar/blob/development/helper.py. It is still increasing, but much less than before.
The idea is to make a backup of the original only if one does not already exist, and then work on the original without fear, since that is what the backup is for. This way we avoid constantly duplicating; the object is duplicated only if there is no previous backup (see the sketch below).
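A minimal sketch of that idea, assuming a hypothetical helper and a "_backup" naming scheme of my own (not the actual Sorcar code):

```python
import bpy

def get_or_create_backup(obj):
    # Hypothetical helper: duplicate the original only once; every later
    # evaluation reuses the same backup instead of duplicating again.
    backup_name = obj.name + "_backup"
    backup = bpy.data.objects.get(backup_name)
    if backup is None:
        backup = obj.copy()
        backup.data = obj.data.copy()  # also copy the mesh datablock
        backup.name = backup_name
        # Note: the copy is not linked to any collection, so it stays out
        # of the view layer and does not clutter the scene.
    return backup
```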
This was just an illusion. It's not possible to do it this way, because if you never delete the original object you can never re-execute the node tree from the beginning :( But at least I think the cause of the memory increase is the massive duplication and removal of objects throughout the working process.
I'll try a new approach: duplicate the object once, and then copy the mesh data from the original object on every subsequent evaluation. This way, duplication happens only once and only one new object remains in the scene.
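A rough sketch of that approach, using hypothetical names (refresh_from_source, "Sorcar_Work"); the real node code may differ:

```python
import bpy

def refresh_from_source(src_obj, work_name="Sorcar_Work"):
    # Duplicate the source object only the first time; afterwards, swap in
    # a fresh copy of the source mesh data and free the old datablock.
    work_obj = bpy.data.objects.get(work_name)
    if work_obj is None:
        work_obj = src_obj.copy()
        work_obj.data = src_obj.data.copy()
        work_obj.name = work_name
        bpy.context.collection.objects.link(work_obj)
    else:
        old_mesh = work_obj.data
        work_obj.data = src_obj.data.copy()
        # Remove the replaced mesh so datablocks don't accumulate.
        bpy.data.meshes.remove(old_mesh)
    return work_obj
```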
I have written this addon for testing: https://pastebin.com/0F5zwFr3
When you change the slider the memory increases, but it reaches a limit and never grows again, which makes me suspect that it is your update=ScNode.update_value callback that causes the memory increase :S
but with this other addon it does happen: https://pastebin.com/8d98PQwC
If you set the undo steps to 0, the memory increases at a very slow rate. This is not a bug; it's how memory is used by the undo system. Your script calls bpy.ops within a loop, and for each call an undo step is added.
Source: https://developer.blender.org/T72853
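For reference, the behaviour described there can be illustrated like this (a sketch, not Sorcar code): every bpy.ops call in a loop pushes an undo step, while the equivalent bpy.data calls do not.

```python
import bpy

# Each operator call in this loop pushes an undo step, so memory grows
# with the number of iterations (until the undo limit is reached):
for i in range(100):
    bpy.ops.mesh.primitive_cube_add(location=(i, 0, 0))

# The low-level data API does similar work without touching the undo stack
# (the mesh here is left empty; building actual geometry is omitted):
mesh = bpy.data.meshes.new("Cube")
for i in range(100):
    obj = bpy.data.objects.new("Cube.%03d" % i, mesh)
    obj.location = (i, 0, 0)
    bpy.context.collection.objects.link(obj)

# Undo memory can also be limited via preferences:
bpy.context.preferences.edit.undo_steps = 0
```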
The new approach (duplicate once, then copy mesh data) works, but there are some issues with it:
- Object location, rotation, etc. need to be manually reset
- Object modifiers need to be manually removed
- Particle systems (if any) remain attached.
Basically, we need to reset all possible changes that users can make to an object through the viewport.
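Something along these lines might cover those resets (a sketch only; the function name is mine, and the list of properties to reset is probably incomplete):

```python
import bpy
from mathutils import Matrix

def reset_object_state(obj):
    # Reset the transform back to identity (location, rotation, scale).
    obj.matrix_world = Matrix.Identity(4)

    # Remove all modifiers. This should also detach particle systems,
    # since each particle system is backed by a ParticleSystem modifier
    # (assumption worth verifying).
    for mod in list(obj.modifiers):
        obj.modifiers.remove(mod)

    # Any remaining per-object settings changed in the viewport would need
    # similar explicit resets here.
```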