
About point cloud

Open ptg666 opened this issue 9 months ago • 23 comments

Rendering results are great on the truck dataset, but the exported point cloud is messy. I can't even find where the truck is. So my question is whether the point cloud derived from 3D Gaussians is suitable for point cloud-based tasks? image image

ptg666 avatar Nov 04 '23 04:11 ptg666

You can use open3d for visualization.

ljjTYJR avatar Nov 04 '23 12:11 ljjTYJR

short answer: not exactly.

One problem is what you showed: the messy background.

I don't know how much noise your task tolerates, but another problem with Gaussian splatting is that for featureless regions (e.g. a white wall), instead of many points on the region, it tends to have only a few points with large scales, which is very different from the point clouds you can get from e.g. multi-view stereo or lidar.

kwea123 avatar Nov 04 '23 12:11 kwea123

@kwea123 Just seed more points in these areas and you get a more precise GS. @ptg666 MeshLab isn't good for checking GS, but you can use it before training to clean/seed the initial point cloud (from COLMAP).

jaco001 avatar Nov 04 '23 13:11 jaco001

@jaco001 even if you seed more, they will eventually be removed by pruning, unless you fix them

kwea123 avatar Nov 04 '23 13:11 kwea123

Just for visualizing the result as a demonstration, I think it is still valuable to use the point cloud generated after the Gaussian optimization. image

image

The left ones are the initial input point clouds generated by SfM, while the right ones are generated by densification after the Gaussian optimization. Even with the pruning, we can see a denser point cloud compared to the initial one. To prevent pruning, you can even disable the densification.

But of course, the colourful, detailed regions will be denser than featureless ones.

ljjTYJR avatar Nov 04 '23 14:11 ljjTYJR

short answer: not exactly.

One problem is what you showed: the messy background.

I don't know how much noise your task tolerates, but another problem with Gaussian splatting is that for featureless regions (e.g. a white wall), instead of many points on the region, it tends to have only a few points with large scales, which is very different from the point clouds you can get from e.g. multi-view stereo or lidar.

My task requires centimeter-accurate point clouds and cannot tolerate noise. @kwea123

ptg666 avatar Nov 05 '23 01:11 ptg666

Just for visualizing the result as a demonstration, I think it is still valuable to use the point cloud generated after the Gaussian optimization. image

image

The left ones are the initial input point clouds generated by SfM, while the right ones are generated by densification after the Gaussian optimization. Even with the pruning, we can see a denser point cloud compared to the initial one. To prevent pruning, you can even disable the densification.

But of course, the colourful, detailed regions will be denser than featureless ones.

In the truck scene it doesn't work. Is it possible that the initial point cloud quality is not very good? @ljjTYJR

ptg666 avatar Nov 05 '23 01:11 ptg666

@ljjTYJR May I ask what software is used to visualize this, and which point cloud file is imported? I imported "output/name/point_cloud/iteration_30000/point_cloud.ply" into MeshLab, as shown in the first image. I expected to be able to produce the effect shown in the second image. image image

yangqinhui0423 avatar Nov 06 '23 06:11 yangqinhui0423

The 'easiest' way is to use SIBR_gaussianViewer_app.exe with a white background. MeshLab doesn't use all the data stored in point_cloud.ply from GS (for now).

jaco001 avatar Nov 06 '23 07:11 jaco001

@yangqinhui0423

I read the generated 3D Gaussian point cloud similar to the way of https://github.com/graphdeco-inria/gaussian-splatting/blob/2eee0e26d2d5fd00ec462df47752223952f6bf4e/scene/gaussian_model.py#L215-L257

You can use the Gaussian means as the point coordinates and convert the features to RGB colors. For visualization, I simply use open3d to visualize the generated point cloud.
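
For reference, a minimal sketch of that approach (assuming plyfile and open3d are installed and using the SH2RGB helper from utils/sh_utils.py; the path below is a placeholder, and the clamp to [0, 1] is my own addition):

    import numpy as np
    import open3d as o3d
    from plyfile import PlyData
    from utils.sh_utils import SH2RGB  # converts only the view-independent DC term

    # placeholder path, adjust to your own output folder
    ply_path = "output/<model>/point_cloud/iteration_30000/point_cloud.ply"
    vertices = PlyData.read(ply_path).elements[0]

    # Gaussian means -> point coordinates
    xyz = np.stack([np.asarray(vertices["x"]),
                    np.asarray(vertices["y"]),
                    np.asarray(vertices["z"])], axis=1)

    # DC spherical-harmonics coefficients -> approximate RGB, clamped to [0, 1]
    f_dc = np.stack([np.asarray(vertices["f_dc_0"]),
                     np.asarray(vertices["f_dc_1"]),
                     np.asarray(vertices["f_dc_2"])], axis=1)
    rgb = np.clip(SH2RGB(f_dc), 0.0, 1.0)

    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(xyz)
    pcd.colors = o3d.utility.Vector3dVector(rgb)
    o3d.visualization.draw_geometries([pcd])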

ljjTYJR avatar Nov 06 '23 08:11 ljjTYJR

Just for visualizing the result as a demonstration, I think it is still valuable to use the point cloud generated after the Gaussian optimization. image

image

The left ones are the initial input point clouds generated by SfM, while the right ones are generated by densification after the Gaussian optimization. Even with the pruning, we can see a denser point cloud compared to the initial one. To prevent pruning, you can even disable the densification.

But of course, the colourful, detailed regions will be denser than featureless ones.

Sir, may I ask how to get the point cloud with color?

hanhantie233 avatar Dec 19 '23 14:12 hanhantie233

@ljjTYJR May I ask what software is used to visualize this, and which point cloud file is imported? I imported "output/name/point_cloud/iteration_30000/point_cloud.ply" into MeshLab, as shown in the first image. I expected to be able to produce the effect shown in the second image. image image

@yangqinhui0423 Sir, did you solve this problem?

luoxue-star avatar Dec 27 '23 13:12 luoxue-star

@luoxue-star @hanhantie233 Briefly speaking, after training the Gaussians, we obtain a *.ply file, which has spherical harmonics parameters attached to each Gaussian.

To recover RGB color in each Gaussian, we can use the SH2RGB function in the code.

To visualize the colored point cloud, I use the open3d package.

ljjTYJR avatar Dec 27 '23 16:12 ljjTYJR

@ljjTYJR Thank you for your response. I encountered some negative values when using the SH2RGB function to obtain color. I use [f_dc_0, f_dc_1, f_dc_2] as inputs for the SH2RGB function. Could you please tell me why?

luoxue-star avatar Dec 29 '23 04:12 luoxue-star

@ljjTYJR Thank you for your response. I encountered some negative values when using the SH2RGB function to obtain color. I use [f_dc_0, f_dc_1, f_dc_2] as inputs for the SH2RGB function. Could you please tell me why?

If you used spherical harmonics in the training, SH2RGB will only convert the base SH parameters with degree 0 (the view-independent DC term).
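
For context, a sketch of that helper as I understand it from utils/sh_utils.py: the DC coefficient stores roughly (color - 0.5) / C0, so negative f_dc values are expected, and SH2RGB just undoes that mapping:

    C0 = 0.28209479177387814  # zeroth-order spherical-harmonics basis constant, 1 / (2 * sqrt(pi))

    def SH2RGB(sh):
        # maps the DC SH coefficient back to color space; the result can still
        # fall outside [0, 1], so clamp it before using it as an RGB color
        return sh * C0 + 0.5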

ljjTYJR avatar Dec 29 '23 08:12 ljjTYJR

@luoxue-star Hello, have you gotten the correct color? I have the same question QAQ

@ljjTYJR Thank you for your response. I encountered some negative values when using the SH2RGB function to obtain color. I use [f_dc_0, f_dc_1, f_dc_2] as inputs for the SH2RGB function. Could you please tell me why?

If you used spherical harmonics in the training, SH2RGB will only convert the base SH parameters with degree 0 (the view-independent DC term).

@ljjTYJR Could you please explain it in more detail? I use the following steps to recover color, but only a few points have color and it is not accurate:

  1. use open3d to load the ply file
  2. load f_dc_0, f_dc_1, f_dc_2 and use SH2RGB to recover color; I only use the base SH parameters with degree 0
  3. this gives an n×3 numpy array that still has negative values, so I clamp them the same way as the author with torch.clamp_min
  4. normalize the result of step 3 to [0, 1] and multiply by 255
  5. convert to np.uint8
  6. xyz -> pcd.points, colors -> pcd.colors
  7. save the pcd file and use CloudCompare to visualize

This is the colored point cloud I get: image

This is the dense COLMAP reconstruction point cloud: image

I want to get a colored point cloud like COLMAP's.

ITBoy-China avatar Jan 25 '24 18:01 ITBoy-China

Sorry, I'm a newbie. How does the second step above work? How do I read the color data? How do I use the SH2RGB function?

GhostRenia avatar Feb 27 '24 07:02 GhostRenia

Did it get solved? I want to be able to view the RGB ply file in meshlab

JayNyce avatar Mar 28 '24 01:03 JayNyce

Did it get solved? I want to be able to view the RGB ply file in meshlab

Hello, I analyzed the source code; we have to extract the RGB information ourselves. The following is my code:

`path = "E:\RenProject\gaussian-splatting\output\cht\point_cloud\iteration_30000\point_cloud.ply" plydata = PlyData.read(path) xyz = np.stack((np.asarray(plydata.elements[0]["x"]), np.asarray(plydata.elements[0]["y"]), np.asarray(plydata.elements[0]["z"])), axis=1) features_dc = np.zeros((xyz.shape[0], 3, 1)) features_dc[:, 0, 0] = np.asarray(plydata.elements[0]["f_dc_0"]) features_dc[:, 1, 0] = np.asarray(plydata.elements[0]["f_dc_1"]) features_dc[:, 2, 0] = np.asarray(plydata.elements[0]["f_dc_2"]) f_d = np.transpose(features_dc, axes=(0,2,1)) f_d_t = f_d[:, 0, :] pcd = o3d.geometry.PointCloud() pcd.points = o3d.utility.Vector3dVector(xyz) pcd.colors = o3d.utility.Vector3dVector(f_d_t)

o3d.io.write_point_cloud("cht-color.ply",pcd)`

GhostRenia avatar Mar 28 '24 11:03 GhostRenia

Since the Gaussian kernel falls off exponentially from the mean, some color values can exceed 1. Here is the code I used to deal with this:

        plydata = PlyData.read(gs_path)

        xyz = np.stack((np.asarray(plydata.elements[0]["x"]),
                        np.asarray(plydata.elements[0]["y"]),
                        np.asarray(plydata.elements[0]["z"])),  axis=1)
        features_dc = np.zeros((xyz.shape[0], 3, 1))
        features_dc[:, 0, 0] = np.asarray(plydata.elements[0]["f_dc_0"])
        features_dc[:, 1, 0] = np.asarray(plydata.elements[0]["f_dc_1"])
        features_dc[:, 2, 0] = np.asarray(plydata.elements[0]["f_dc_2"])
        rgb = SH2RGB(features_dc[..., 0])
        # clamp the lower bound of rgb values to 0
        rgb = np.maximum(rgb, 0)

        opacities = np.asarray(plydata.elements[0]["opacity"])[..., np.newaxis]
        opacities = self.sigmoid(opacities)
        opacity_mask = (opacities > 0.005).squeeze(1)
        xyz = xyz[opacity_mask]
        rgb = rgb[opacity_mask]

        # for points with rgb values larger than 1, we need to rescale all channels by making the largest channel 1
        max_rgb = np.max(rgb, axis=1)
        max_rgb = np.maximum(max_rgb, 1)
        rgb = rgb / max_rgb[:, np.newaxis]

        # for checking
        pcd = o3d.geometry.PointCloud()
        pcd.points = o3d.utility.Vector3dVector(xyz)
        pcd.colors = o3d.utility.Vector3dVector(rgb)
        o3d.io.write_point_cloud("test.ply",pcd)

Hope this code can help others!

KzZheng avatar Apr 25 '24 08:04 KzZheng

I am relatively new to coding. For the line with opacities = self.sigmoid(opacities), what is self and which class is it from?

JayNyce avatar Apr 25 '24 13:04 JayNyce

I am relatively new to coding. For the line with opacities = self.sigmoid(opacities), what is self and which class is it from?

It is just a sigmoid function implemented with numpy:

    def sigmoid(x):
        return 1 / (1 + np.exp(-x))

KzZheng avatar Apr 25 '24 20:04 KzZheng

You can refer to this part of gaussian_renderer/__init__.py:

    # If precomputed colors are provided, use them. Otherwise, if it is desired to precompute colors
    # from SHs in Python, do it. If not, then SH -> RGB conversion will be done by rasterizer.
    shs = None
    colors_precomp = None
    if override_color is None:
        if pipe.convert_SHs_python:
            shs_view = pc.get_features.transpose(1, 2).view(-1, 3, (pc.max_sh_degree+1)**2)
            dir_pp = (pc.get_xyz - viewpoint_camera.camera_center.repeat(pc.get_features.shape[0], 1))
            dir_pp_normalized = dir_pp/dir_pp.norm(dim=1, keepdim=True)
            sh2rgb = eval_sh(pc.active_sh_degree, shs_view, dir_pp_normalized)
            colors_precomp = torch.clamp_min(sh2rgb + 0.5, 0.0)
        else:
            shs = pc.get_features
    else:
        colors_precomp = override_color

Copy this code into save_ply in gaussian_model.py. You also have to modify construct_list_of_attributes, change the signature to save_ply(self, path, viewpoint_camera), change the save call in train.py to scene.save(iteration, viewpoint_cam), and likewise change Scene's save to save(self, iteration, viewpoint_cam). The final code is:

    def construct_list_of_attributes(self):
        l = ['x', 'y', 'z', 'nx', 'ny', 'nz']
        # All channels except the 3 DC
        for i in range(self._features_dc.shape[1]*self._features_dc.shape[2]):
            l.append('f_dc_{}'.format(i))
        for i in range(self._features_rest.shape[1]*self._features_rest.shape[2]):
            l.append('f_rest_{}'.format(i))
        l.append('opacity')
        for i in range(self._scaling.shape[1]):
            l.append('scale_{}'.format(i))
        for i in range(self._rotation.shape[1]):
            l.append('rot_{}'.format(i))
        # Add color attributes
        l.append('red')
        l.append('green')
        l.append('blue')
        return l

and in save_ply change it to:

    ......

    # If precomputed colors are provided, use them. Otherwise, if it is desired to precompute colors
    # from SHs in Python, do it. If not, then SH -> RGB conversion will be done by rasterizer.
    shs_view = self.get_features.transpose(1, 2).view(-1, 3, (self.max_sh_degree+1)**2)
    dir_pp = (self.get_xyz - viewpoint_camera.camera_center.repeat(self.get_features.shape[0], 1))
    dir_pp_normalized = dir_pp/dir_pp.norm(dim=1, keepdim=True)
    sh2rgb = eval_sh(self.active_sh_degree, shs_view, dir_pp_normalized)
    # colors_precomp = torch.clamp_min(sh2rgb + 0.5, 0.0).cpu().numpy()
    # colors_precomp = (torch.clamp_min(sh2rgb, 0.0)* 255).cpu().numpy()
    # make sure the color values are mapped into the 0-255 range
    colors_precomp = (torch.clamp_min(sh2rgb + 0.4, 0.0)* 255).cpu().numpy()

    dtype_full = [(attr, 'f4') if attr not in ['red', 'green', 'blue'] else (attr, 'u1') for attr in self.construct_list_of_attributes()]
    # dtype_full = [(attribute, 'f4') for attribute in self.construct_list_of_attributes()]

    elements = np.empty(xyz.shape[0], dtype=dtype_full)
    attributes = np.concatenate((xyz, normals, f_dc, f_rest, opacities, scale, rotation, colors_precomp), axis=1)

    ......

You can tune the offset in sh2rgb + XXX yourself. Afterwards, both CloudCompare and MeshLab work; in MeshLab, setting shading to None or Dot is fine. image Or use CloudCompare (see https://github.com/graphdeco-inria/gaussian-splatting/issues/674#issuecomment-1968267819) and click RGB in the properties panel. image

F011011110100 avatar May 07 '24 09:05 F011011110100