
reduce VRAM requirement

Open alexjunholee opened this issue 1 year ago • 4 comments

Hi,

I use a system with a single graphics card (16 GB VRAM) and a large amount of RAM instead (because RAM expansion is always cheaper than VRAM expansion). Therefore, I had to lower VRAM usage and load the images into host memory.

By simply not loading original_image onto CUDA in the Camera class, I got what I expected. There are no other problems, since in the training part the loss function is already calculated with the following:

gt_image = torch.clamp(viewpoint.original_image.to("cuda"), 0.0, 1.0)
l1_test += l1_loss(image, gt_image).mean().double()
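The idea above can be sketched as follows. This is a minimal illustration, not the repository's actual Camera class: the ground-truth image stays in host RAM and is transferred to the GPU only when the loss is computed, trading a small per-iteration copy for lower VRAM usage. The class and helper names here are simplified stand-ins.

```python
import torch

class Camera:
    """Simplified stand-in for the repo's Camera: stores the ground-truth
    image on `data_device` instead of unconditionally on CUDA."""
    def __init__(self, image: torch.Tensor, data_device: str = "cpu"):
        # "cpu" keeps VRAM free; "cuda" reproduces the original behavior.
        self.original_image = image.clamp(0.0, 1.0).to(data_device)

def l1_loss(pred: torch.Tensor, gt: torch.Tensor) -> torch.Tensor:
    # Plain L1 loss, as used in the snippet above.
    return (pred - gt).abs().mean()

# Per-iteration use: the host-to-device transfer happens here, at loss time.
device = "cuda" if torch.cuda.is_available() else "cpu"
cam = Camera(torch.rand(3, 4, 4), data_device="cpu")
rendered = torch.rand(3, 4, 4, device=device)      # stand-in for the rasterized image
gt_image = torch.clamp(cam.original_image.to(device), 0.0, 1.0)
loss = l1_loss(rendered, gt_image)
```

The per-iteration `.to(device)` copy is the cost hbb1 points out below; for small images it is usually negligible compared to rasterization.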

Please merge if you find this useful. Thanks for your great work!

alexjunholee avatar May 27 '24 01:05 alexjunholee

Thank you for your PR. Indeed, this is important. However, I think it will increase training time, because we need to load the image onto the GPU every iteration. Perhaps for large scenes with thousands of images, a smarter data loader should be implemented. I think it would be great to leave that for future development.

hbb1 avatar May 27 '24 15:05 hbb1

Hi, can you add an argument like data_device so that we can control which device the data is placed on?

hbb1 avatar Jun 10 '24 04:06 hbb1

I think this may be useful when processing a large number of images, or with --resolution 1. Update: the --data_device parameter does exactly this. No code change is needed now.
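For reference, the flag would be used roughly like this. This is a hypothetical invocation: the script name, dataset path placeholder, and default value follow the upstream Gaussian-splatting convention (where --data_device defaults to "cuda"), and are not verified against this repository.

```shell
# Keep ground-truth images in host RAM instead of VRAM during training.
# <path_to_data> is a placeholder for your dataset directory.
python train.py -s <path_to_data> --data_device cpu --resolution 1
```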

oUp2Uo avatar Jul 02 '24 06:07 oUp2Uo

Thanks for the nice code, it helps a lot! :)

Zerui-Yu avatar Sep 28 '24 15:09 Zerui-Yu

Ohhh, I see the data_device argument. Ready to merge!

hbb1 avatar Dec 30 '24 11:12 hbb1