
Out-of-memory error when playing a longer dataset (>400 frames)

Open wittmeis opened this issue 3 years ago • 5 comments

When I play a larger dataset, the JS heap size grows to its maximum of ~3 GB and the browser throws an out-of-memory error.

Any idea how to fix this? From the heap snapshot it looks to me as if the old worlds are not properly deleted.

wittmeis avatar Nov 23 '22 19:11 wittmeis

I just had a look at the heap snapshots. After calling `world.deleteAll()`, `Lidar.pcd` still exists. This is definitely one of the reasons for the growing heap size.

I simply set `Lidar.pcd = null` in `Lidar.remove_all_points()`. I am not sure whether this is the correct fix, but it reduced the heap size considerably.

However, the `webglGroup` in the world object also seems to be an issue, so I set that to `null` in `world.deleteAll()` as well.
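For illustration, here is a minimal sketch of the fix described above. The class and field names (`Lidar.pcd`, `remove_all_points`, `webglGroup`, `deleteAll`) follow the wording of this thread, not necessarily the actual SUSTechPOINTS source; the point is simply that breaking the last references lets the GC reclaim the large point-cloud buffers:

```javascript
// Sketch only: stand-ins for the real SUSTechPOINTS classes.
class Lidar {
  constructor(pcd) {
    this.pcd = pcd;              // large typed-array point cloud
  }
  remove_all_points() {
    // ...existing cleanup (dispose geometry, remove from scene)...
    this.pcd = null;             // proposed: break the last reference
  }
}

class World {
  constructor(pcd) {
    this.lidar = new Lidar(pcd);
    this.webglGroup = {};        // stand-in for the THREE.Group
  }
  deleteAll() {
    this.lidar.remove_all_points();
    this.webglGroup = null;      // proposed: release the scene-graph group too
  }
}

const w = new World(new Float32Array(1024));
w.deleteAll();
console.log(w.lidar.pcd, w.webglGroup); // null null
```

As long as any live object still holds `pcd` or `webglGroup`, the underlying buffers stay reachable and cannot be collected, which matches what the heap snapshots show.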

The heap snapshot then looks like this after deleting all worlds via `editor.data.worldList.forEach((w) => w.deleteAll())`:

(heap snapshot screenshot)

It seems there are further references in the annotation object that still prevent garbage collection.
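As a general mitigation (not something the repo implements, as far as this thread shows), playback of long sequences could keep only a sliding window of worlds in memory and delete the rest, rather than deleting everything at once. A hypothetical sketch, using stub world objects in place of the real `editor.data.worldList` entries:

```javascript
// Hypothetical sliding-window pruning: delete worlds far from the current
// frame so the heap holds at most ~2*keep+1 point clouds at a time.
function pruneWorlds(worldList, currentIndex, keep = 20) {
  worldList.forEach((w, i) => {
    if (Math.abs(i - currentIndex) > keep && !w.deleted) {
      w.deleteAll();             // release pcd / webglGroup as described above
      w.deleted = true;
    }
  });
}

// Stub worlds for demonstration only.
const worlds = Array.from({ length: 100 }, () => ({
  deleted: false,
  deleteAll() { /* release pcd, webglGroup, ... */ },
}));
pruneWorlds(worlds, 50, 10);
console.log(worlds.filter(w => !w.deleted).length); // 21 (frames 40..60)
```

This only helps, of course, if `deleteAll()` actually makes the worlds collectable, which is exactly the leak discussed in this thread.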

wittmeis avatar Nov 23 '22 19:11 wittmeis

The main problem could be here; try changing this line to `return distant`. Alternatively, try the `fusion` branch: I fixed some memory-leak bugs there but haven't had time to merge them into the other branches yet.
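The exact line being referenced is not visible in this thread, so the following is only a guess at the shape of the suggestion: a distance computation that should return just the scalar. One plausible leak mechanism (an assumption, not stated above) is that the function instead returned, or a caller cached, an object still referencing the full points array:

```javascript
// Hypothetical sketch; function and variable names are invented.
function distToBox(points, box) {
  let distant = Infinity;
  for (let i = 0; i < points.length; i += 3) {
    const d = Math.hypot(
      points[i] - box.x,
      points[i + 1] - box.y,
      points[i + 2] - box.z
    );
    if (d < distant) distant = d;
  }
  // Returning e.g. { points, distant } and caching the result would keep
  // the whole Float32Array alive; returning only the scalar avoids that.
  return distant;
}

console.log(distToBox(new Float32Array([3, 4, 0]), { x: 0, y: 0, z: 0 })); // 5
```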

naurril avatar Nov 24 '22 04:11 naurril

Thanks, I tried the proposed fix with `return distant`, but unfortunately the leaks are still there.

I will have a look at the `fusion` branch as well, though.

wittmeis avatar Nov 24 '22 15:11 wittmeis

Have you solved the problem with the `fusion` branch?

nnop avatar May 28 '24 06:05 nnop