SegmentAnythingin3D

How to render the depth map from SA3D?

Lizhinwafu opened this issue 1 year ago · 5 comments

How can I render the depth map from SA3D?

Lizhinwafu avatar Sep 10 '24 14:09 Lizhinwafu

You may need to change the code in lib/render_utils.py. There we convert the depth to a colormap for visualization; you can save the original depth values directly instead of converting them.

Jumpat avatar Sep 14 '24 03:09 Jumpat
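A minimal sketch of what that change could look like, not the repo's exact code: it assumes a list of per-view depth maps (here called `depths`) is available in the render loop, and stores the raw float values alongside an optional scaled 16-bit preview.

```python
# Sketch: save raw depth values instead of (or in addition to) a colormap.
# `depths` is assumed to be an iterable of per-view depth maps (H, W) produced
# during rendering, as in a typical render loop in lib/render_utils.py.
import os
import numpy as np
import imageio.v2 as imageio

def save_depths(depths, out_dir):
    os.makedirs(out_dir, exist_ok=True)
    for i, depth in enumerate(depths):
        depth = np.asarray(depth, dtype=np.float32)
        # Lossless float depth, usable later for metric tasks such as back-projection.
        np.save(os.path.join(out_dir, f'depth_{i:03d}.npy'), depth)
        # Optional 16-bit PNG preview (normalized, so no longer metric).
        d = depth - depth.min()
        d = d / max(float(d.max()), 1e-8)
        imageio.imwrite(os.path.join(out_dir, f'depth_{i:03d}.png'),
                        (d * 65535).astype(np.uint16))
```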

How should I understand 'aligned'? Theoretically they should be aligned. However, the depth is estimated by NeRF, so it may not match the actual scene geometry, but it should be aligned with the rendered RGB.

3DGS can export a depth map. However, 3DGS represents the scene as a set of 3D Gaussians, which can be regarded as a kind of point cloud, so you can directly export these Gaussians as a point cloud rather than estimating one from the depth map.

Jumpat avatar Sep 14 '24 03:09 Jumpat
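A rough sketch of exporting Gaussian centers as a point cloud, under the assumption that the model object exposes its means as an (N, 3) tensor via a `get_xyz` property, as in common 3D-GS implementations; attribute names may differ in practice.

```python
# Sketch: dump the 3D Gaussian means to a PLY file as a point cloud.
import numpy as np
import open3d as o3d

def export_gaussians_as_point_cloud(gaussians, path='gaussians.ply'):
    # `get_xyz` is assumed to return the (N, 3) Gaussian centers.
    xyz = gaussians.get_xyz.detach().cpu().numpy()
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(xyz.astype(np.float64))
    o3d.io.write_point_cloud(path, pcd)
```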

Aligned: each pixel in the RGB image corresponds one-to-one with a pixel in the depth map. If they are aligned, I can detect the object in the RGB image and combine it with the depth to get a 3D model.

Yes, they are.
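Since the RGB and depth maps are pixel-aligned, a standard back-projection with the camera intrinsics turns them into a colored point cloud. The sketch below assumes a pinhole camera with intrinsics K and a 4x4 camera-to-world matrix c2w; axis conventions may need adjusting to match the codebase.

```python
# Sketch: back-project an aligned RGB/depth pair into a colored point cloud.
import numpy as np

def rgbd_to_points(rgb, depth, K, c2w):
    H, W = depth.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(W), np.arange(H))   # pixel coordinates
    z = depth
    x = (u - cx) / fx * z
    y = (v - cy) / fy * z
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)
    pts_world = (c2w @ pts_cam.T).T[:, :3]           # camera frame -> world frame
    colors = rgb.reshape(-1, 3)
    valid = z.reshape(-1) > 0                        # drop empty/invalid depth
    return pts_world[valid], colors[valid]
```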

Another question: I want to customize the rendering viewpoints. Which code should I modify? I only need to define four viewing angles.

To define the rendering view manually, you need to modify the camera poses, which is not an easy task. You can find where the camera matrices are used here.

Jumpat avatar Sep 15 '24 12:09 Jumpat
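For reference, a camera-to-world pose can be built from an azimuth/elevation pair, in the spirit of the `pose_spherical` helpers found in NeRF-style codebases. This is only a hedged sketch: the angle and axis conventions below are assumptions and may need to be flipped to match the repo.

```python
# Sketch: build camera-to-world poses from viewing angles (look-at the origin).
import numpy as np

def pose_from_angles(azimuth_deg, elevation_deg, radius):
    az, el = np.deg2rad(azimuth_deg), np.deg2rad(elevation_deg)
    # Camera position on a sphere around the scene origin.
    cam_pos = radius * np.array([np.cos(el) * np.sin(az),
                                 np.cos(el) * np.cos(az),
                                 np.sin(el)])
    # Look-at construction with +z as the world up axis (convention assumed).
    forward = -cam_pos / np.linalg.norm(cam_pos)
    up = np.array([0.0, 0.0, 1.0])
    right = np.cross(up, forward)
    right /= np.linalg.norm(right)
    true_up = np.cross(forward, right)
    c2w = np.eye(4)
    c2w[:3, :3] = np.stack([right, true_up, forward], axis=1)
    c2w[:3, 3] = cam_pos
    return c2w

# Four custom views, e.g. azimuths 0/90/180/270 degrees at 30 degrees elevation.
poses = [pose_from_angles(a, 30.0, 4.0) for a in (0, 90, 180, 270)]
```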

The NeRF backbone you use is TensoRF; how can I change it to other models?

In this codebase it is hard to swap out, but I have integrated SA3D with 3D-GS. The SA3D-GS branch currently has some bugs; you can refer to this repo for a fixed version.

Jumpat avatar Sep 22 '24 06:09 Jumpat

I found that after running run.py there are 120 rendered images. Where are the camera poses for these views (rotation and translation matrices) saved?

We did not save this trajectory, but you can refer to the code here that generates it.

Jumpat avatar Sep 22 '24 06:09 Jumpat
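If the pose array is needed on disk, a small hook like the one below could dump it at the point where the render trajectory is built. The variable name `render_poses` and its (N, 4, 4) camera-to-world layout are assumptions based on typical NeRF codebases, not confirmed details of run.py.

```python
# Sketch: save the render trajectory and split each pose into rotation/translation.
import numpy as np

def save_render_poses(render_poses, path='render_poses.npy'):
    poses = np.asarray(render_poses)     # assumed shape (N, 4, 4), camera-to-world
    np.save(path, poses)
    for i, c2w in enumerate(poses):
        R = c2w[:3, :3]                  # rotation (camera-to-world)
        t = c2w[:3, 3]                   # translation (camera center in world frame)
        print(f'view {i:03d}: R=\n{R}\nt={t}')
```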