vision_blender

Increased RAM usage with each render

Open satpalsr opened this issue 2 years ago • 2 comments

I was creating a synthetic image-segmentation dataset of a few thousand images. I noticed that RAM usage increases with each render, making consecutive renders progressively slower.

To reproduce:

  1. Open Blender.
  2. Set the render engine to Cycles and enable vision_blender's image-segmentation output.
  3. Adjust a few render settings (e.g. lower the noise threshold, enable Fast GI Approximation) to speed up rendering.
  4. Run the following Python code to render:
import bpy
import os

base_path = './'

for i in range(5000):
    # Write each frame to ./rgb/000000.png, ./rgb/000001.png, ...
    bpy.context.scene.render.filepath = os.path.join(base_path, 'rgb', f'{i:06d}.png')
    bpy.ops.render.render(write_still=True)
    print(i)

You can also download these Blender files directly to reproduce.
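To quantify the leak, it can help to log the process's peak resident memory after each render. A minimal sketch using only Python's standard library; the commented loop assumes the render script above and only runs inside Blender:

```python
import resource


def max_rss_kib():
    """Peak resident set size of this process.

    Note: ru_maxrss is reported in KiB on Linux but in bytes on macOS.
    """
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss


# Inside the render loop you could then log growth per frame, e.g.:
# for i in range(5000):
#     bpy.ops.render.render(write_still=True)
#     print(i, max_rss_kib())
```

If the logged value climbs steadily across frames, the leak is in the Blender process itself rather than in the OS file cache.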

satpalsr avatar Apr 11 '22 11:04 satpalsr

Very interesting, I will have a look; maybe I need to delete the temporary files that are created while the ground truth is being generated.

Cartucho avatar May 30 '22 10:05 Cartucho

I checked and I was already deleting the temporary files.

I think the main cause of this issue is that we keep writing temporary files, which is not ideal. Initially, I wanted to get the data directly from the ViewerNode, but it turns out that there is a bug that prevents me from doing this.

I could try writing all the data into a single multilayer OpenEXR file, instead of the multiple OpenEXR files I write now. I am not sure whether that would fix it.
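One way to test the single-file idea is Blender's built-in multilayer EXR output, which packs all enabled render passes into one file per frame. A configuration sketch, assuming Blender's Python API (`bpy`, only available inside Blender); the output path is illustrative:

```python
import bpy

# Switch the output to a single multilayer EXR so all render passes land
# in one file per frame, instead of one OpenEXR file per pass.
scene = bpy.context.scene
scene.render.image_settings.file_format = 'OPEN_EXR_MULTILAYER'
scene.render.image_settings.color_depth = '32'  # full-float ground truth
scene.render.filepath = '//gt/'  # '//' is relative to the .blend file
```

Whether this reduces the memory growth is an open question, but it at least cuts the number of temporary files written per frame.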

Also, I found that if we run bpy.ops.outliner.orphans_purge() at the end of rendering, the memory is freed. I tried running this command after each render, but sometimes it crashes the entire program, so I prefer not to add it there.
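A possible compromise is to purge orphan data-blocks only every N renders, wrapped in a try/except so a failing purge skips that frame instead of aborting the whole run. A sketch; the interval of 200 is an arbitrary assumption, and `bpy.ops.outliner.orphans_purge()` only exists inside Blender:

```python
def should_purge(frame_index, every=200):
    """Return True every `every` frames (skipping frame 0)."""
    return frame_index > 0 and frame_index % every == 0


# Inside Blender, the render loop could then look like:
# for i in range(5000):
#     bpy.ops.render.render(write_still=True)
#     if should_purge(i):
#         try:
#             bpy.ops.outliner.orphans_purge(do_recursive=True)
#         except RuntimeError:
#             pass  # purging occasionally fails; skip rather than crash
```

This caps memory growth between purges while running the risky operator far less often than once per frame.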

Cartucho avatar May 30 '22 15:05 Cartucho