RayDF
Is there any code to process my own datasets?
I have a new question: how much GPU memory is needed? My GPU has 8 GB and it reports out of memory.
Thanks for your attention to our paper. We ran the experiment on a GPU with 24 GB of memory (RTX 3090). The memory usage depends on the settings of `N_rand` and `N_views`; it is suggested to decrease these hyperparameters if GPU memory is limited.
May I ask what type of dataset you own? Is it RGB frames/video or depth scans?
I have several datasets for 3D reconstruction. They are all RGB frames and can be aligned using COLMAP. So I wonder how to transform my own datasets into the format that this code requires.
same question.
For the camera settings, we use `colmap2nerf.py` from Instant-NGP to convert the COLMAP outputs (`cameras.txt`, `images.txt`) to `transforms.json`. As for the distance supervision, you can first load the depth maps obtained by COLMAP (`*.geometric.bin`) and convert them to distance values using the conversion function `convert_d(d, scene_info, out='dist')` provided in `utils/math.py`.
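A minimal sketch of loading one of those depth maps, assuming COLMAP's standard dense-map layout (an ASCII `width&height&channels&` header followed by column-major float32 data); the result would then be passed through the repo's `convert_d` for the distance conversion:

```python
import numpy as np

def read_colmap_depth(path):
    """Read a COLMAP dense depth map (*.geometric.bin or *.photometric.bin).

    The file starts with an ASCII header "width&height&channels&",
    followed by the map as float32 values in column-major order.
    """
    with open(path, "rb") as f:
        # Read the three '&'-terminated header fields.
        fields = []
        for _ in range(3):
            token = b""
            c = f.read(1)
            while c != b"&":
                token += c
                c = f.read(1)
            fields.append(int(token))
        width, height, channels = fields
        data = np.fromfile(f, dtype=np.float32)
    data = data.reshape((width, height, channels), order="F")
    # Transpose to (height, width, channels) and drop the channel axis.
    return np.transpose(data, (1, 0, 2)).squeeze()

# depth = read_colmap_depth("stereo/depth_maps/0001.png.geometric.bin")
# dist = convert_d(depth, scene_info, out='dist')  # repo's conversion in utils/math.py
```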
About the hyperparameters `sphere_center` and `radius`: they can be computed from the mesh obtained by COLMAP. In particular, `sphere_center` is the center of the mesh, and `radius` is half the maximum side length of the mesh's bounding box.
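A quick sketch of that computation, assuming the mesh vertices are available as an `(N, 3)` array (e.g. via `trimesh.load(...).vertices`) and taking "center of the mesh" to mean the bounding-box center:

```python
import numpy as np

def sphere_params(vertices):
    """Compute sphere_center and radius from an (N, 3) vertex array."""
    vmin = vertices.min(axis=0)
    vmax = vertices.max(axis=0)
    sphere_center = (vmin + vmax) / 2.0   # center of the axis-aligned bounding box
    radius = (vmax - vmin).max() / 2.0    # half the longest box side
    return sphere_center, radius
```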
Then you can rewrite the dataloader script data/load_${dataset}.py according to your own dataset.
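The exact interface your dataloader must expose is defined by the existing `data/load_*.py` scripts, but parsing the `transforms.json` produced by `colmap2nerf.py` is a common first step. A minimal sketch (the function name and return signature are illustrative, not the repo's API):

```python
import json
import numpy as np

def load_transforms(path):
    """Parse an Instant-NGP style transforms.json from colmap2nerf.py."""
    with open(path) as f:
        meta = json.load(f)
    # (N, 4, 4) camera-to-world matrices, one per frame.
    poses = np.array([fr["transform_matrix"] for fr in meta["frames"]],
                     dtype=np.float32)
    image_paths = [fr["file_path"] for fr in meta["frames"]]
    # camera_angle_x is the horizontal FOV in radians;
    # focal = 0.5 * image_width / tan(0.5 * camera_angle_x).
    fov_x = meta["camera_angle_x"]
    return poses, image_paths, fov_x
```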