
On "color spill", or "flying pixels"

francescomilano172 opened this issue 3 years ago · 4 comments

Describe the bug

Using the factory calibration, point clouds exhibit “color spill”, or “flying pixels”, at the object boundaries. This happens both with the latest version of the ROS driver and with the Azure Kinect Viewer. Can this problem be solved through manual calibration?

To Reproduce

Kinect Azure Viewer

  1. Launch k4aviewer and open the device.
  2. For View Mode, select 3D and Color.
  3. Point the camera at any object and observe the color spill at the object boundaries.

/points2 topic from ROS driver

  1. Set the flags point_cloud and rgb_point_cloud to true in kinect_rgbd.launch. Also keep point_cloud_in_depth_frame set to false, so that the point cloud is produced by depth_to_rgb-style backprojection. This should be the option that gives the least color spill, as mentioned for instance here.
  2. Run kinect_rgbd.launch and look at the /points2 topic in RViz (a small subscriber sketch for inspecting the cloud is shown after this list).
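For quick inspection outside of RViz, here is a minimal sketch (not part of the driver) that subscribes to /points2 and reports how many points carry finite depth; the node and callback names are made up for illustration, and the topic is assumed to be the default /points2 with an organized (width × height) cloud:

```python
#!/usr/bin/env python
# Minimal sketch: subscribe to /points2 and report how many points carry
# finite depth, as a quick sanity check while reproducing the issue.
# Node and function names are illustrative; the topic name is assumed
# to be the default /points2.
import numpy as np
import rospy
from sensor_msgs import point_cloud2
from sensor_msgs.msg import PointCloud2


def inspect_cloud(msg):
    # Read x, y, z for every point, keeping NaNs so the organized grid
    # (msg.width x msg.height) stays intact.
    pts = np.array(list(point_cloud2.read_points(
        msg, field_names=("x", "y", "z"), skip_nans=False)))
    valid = np.isfinite(pts).all(axis=1)
    rospy.loginfo("cloud %dx%d: %d / %d points with finite depth",
                  msg.width, msg.height, int(valid.sum()), pts.shape[0])


if __name__ == "__main__":
    rospy.init_node("points2_inspector")
    rospy.Subscriber("/points2", PointCloud2, inspect_cloud, queue_size=1)
    rospy.spin()
```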

Expected behavior

In the desired setup, I need aligned and rectified RGB and depth frames, from which I use the camera intrinsics to retrieve a point cloud. In that point cloud, there should be no color spill.
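To make the intended pipeline concrete, below is a rough sketch of that backprojection step under a plain pinhole model. It is not the driver's actual implementation; it assumes a depth image already registered into the rectified RGB frame and intrinsics fx, fy, cx, cy taken from the corresponding camera_info:

```python
import numpy as np


def backproject(depth, rgb, fx, fy, cx, cy):
    """Backproject a registered depth image (in metres) into a coloured point cloud.

    Sketch only: assumes depth and rgb have the same resolution and are already
    rectified and registered, so a simple pinhole model applies.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack((x, y, z), axis=-1).reshape(-1, 3)
    colors = rgb.reshape(-1, 3)
    valid = points[:, 2] > 0  # drop pixels without a depth measurement
    return points[valid], colors[valid]
```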

Screenshots

In all of the following screenshots, the color spill at the edges of the pink ball is visible.

  • Rviz screenshot of the /points2 topic (point_cloud_in_depth_frame set to true): color_spill_WFOV_rgb_to_depth_2022-01-31-17-05-07_points2_rviz
  • Rviz screenshot of the /points2 topic (point_cloud_in_depth_frame set to false): color_spill_WFOV_depth_to_rgb_2022-01-31-16-59-27_points2_rviz
  • Azure Kinect Viewer screenshot of the 3-D point cloud, 720p RGB, WFOV Depth: color_spill_WFOV
  • Azure Kinect Viewer screenshot of the 3-D point cloud, 720p RGB, NFOV Depth: color_spill_NFOV

Desktop:

  • OS: Ubuntu
  • Version: 18.04.6
  • Commit of Azure_Kinect_ROS_Driver: c0742b9e470c9e688d796029f10cb52e1a763a4a
  • k4aviewer Firmware: RGB Camera: 1.6.102, Depth Camera: 1.6.75

Additional context

It is unclear to me whether color spill is a problem that can be avoided at all. Previous conversations seem to suggest that this is an inherent limitation of the Azure Kinect (e.g., the paper mentioned here). On the other hand, some threads suggest that the problem might be related to imperfect calibration/camera alignment (e.g., here and here) and that a custom calibration can yield better alignment between the RGB and IR cameras (e.g., here and here). However, the conversations are overall inconclusive, with mixed opinions (negative 1, negative 2, positive, unclear).

I also saw that there is now the possibility to manually calibrate the intrinsics of the cameras and use them through the ROS interface instead of the factory calibration (here and here). Can a custom calibration alleviate or fix this problem, or is it a hardware limitation?

Also, is this to some extent due to the interpolation introduced both when warping the depth image into the RGB frame (e.g., here and here) and when rectifying the images before backprojection (as happens in rgbd_launch/image_geometry/cv2.remap; see, e.g., this issue)? In particular, different interpolation schemes produce very different results (see, e.g., here), but even the recommended nearest-neighbor interpolation used in rgbd_launch for rectification (here) does not solve the color-spill problem.
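As a small illustration of why the interpolation scheme matters for depth, the sketch below rectifies a depth image with cv2.remap using both linear and nearest-neighbor interpolation. The input file names and the calibration are placeholders; the maps are assumed to come from cv2.initUndistortRectifyMap with the (possibly re-calibrated) intrinsics. Linear interpolation blends depth values across object boundaries and thus invents depths that were never measured, while nearest-neighbor only copies measured values, although, as noted above, it does not remove the color spill either:

```python
import cv2
import numpy as np

# Placeholder inputs: a metric depth image plus the intrinsics and distortion
# coefficients of the camera it was taken with.
depth = np.load("depth.npy").astype(np.float32)   # HxW, metres
K = np.load("K.npy")                              # 3x3 camera matrix
dist = np.load("dist.npy")                        # distortion coefficients

h, w = depth.shape
map1, map2 = cv2.initUndistortRectifyMap(K, dist, np.eye(3), K, (w, h),
                                         cv2.CV_32FC1)

# Linear interpolation mixes depth values across discontinuities ...
rect_linear = cv2.remap(depth, map1, map2, cv2.INTER_LINEAR)
# ... whereas nearest-neighbor only ever copies measured depth values.
rect_nearest = cv2.remap(depth, map1, map2, cv2.INTER_NEAREST)

diff = np.abs(rect_linear - rect_nearest)
print("max depth difference between the two schemes: %.3f m" % np.nanmax(diff))
```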

francescomilano172 · Feb 02 '22 15:02

Thank you for the bug report and the details.

ooeygui · Feb 02 '22 20:02

If k4aviewer already shows this effect on the point cloud, then there is nothing the ROS node can do about the point cloud on /points2.

You can use the original colour and depth image and the intrinsics+extrinsics in two ways:

  1. use the rgbd_launch nodes via kinect_rgbd.launch to do the rectification, registration and projection manually
  2. manually calibrate the camera and check whether kinect_rgbd.launch provides better results afterwards

It may be sufficient to only adjust the extrinsic calibration between the colour and depth camera and keep the intrinsics as they are.
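One rough way to explore option 2 without a full calibration pipeline, sketched below with assumed inputs: take 3-D points from the depth camera, apply the factory depth-to-colour extrinsics plus a small hand-tuned correction, project them into the colour image with the colour intrinsics, and judge visually how well depth edges line up with colour edges.

```python
import cv2
import numpy as np


def project_depth_into_color(points_depth, R, t, R_corr, t_corr, K_color, dist_color):
    """Project depth-camera points into the colour image after applying a manual
    extrinsic correction (R_corr, t_corr) on top of the factory depth-to-colour
    extrinsics (R, t). Sketch only; all inputs are assumed to be provided."""
    # Factory extrinsics first, then the small hand-tuned correction.
    pts_color = (R_corr @ (R @ points_depth.T + t.reshape(3, 1))
                 + t_corr.reshape(3, 1)).T
    uv, _ = cv2.projectPoints(pts_color, np.zeros(3), np.zeros(3),
                              K_color, dist_color)
    return uv.reshape(-1, 2)


# Example correction to try: a ~0.2 degree rotation about y and a 1 mm shift in x.
R_corr, _ = cv2.Rodrigues(np.array([0.0, 0.0035, 0.0]))
t_corr = np.array([0.001, 0.0, 0.0])
```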

christian-rauch · Apr 19 '22 17:04

Hi @christian-rauch, thank you for your answer. For number 1., yes, this is what we are already doing, following the procedure in https://github.com/microsoft/Azure_Kinect_ROS_Driver/issues/212. For 2., can you recommend any calibration procedure for the extrinsics between the colour and the IR camera?

francescomilano172 · Apr 20 '22 12:04

Since https://github.com/microsoft/Azure_Kinect_ROS_Driver/pull/200, you can use the camera_calibration package to assign new intrinsic parameters. You can calibrate the RGB and IR cameras separately.

The extrinsic parameters are published as tf. But I don't know of a procedure that does this automatically. You may have to figure that out manually by adjusting the extrinsic tf and comparing the quality of the registration.
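One way to do that trial-and-error adjustment, sketched below with assumed frame names (they depend on the tf_prefix you use) and placeholder numbers, is to publish a hand-tuned static transform between the depth and RGB camera frames and re-check the registration in RViz after every tweak:

```python
#!/usr/bin/env python
# Sketch: publish a hand-tuned static transform between the depth and RGB
# camera frames and compare the registration quality in RViz afterwards.
# Frame names and numbers are assumptions, not the driver's actual values.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped
from tf.transformations import quaternion_from_euler

if __name__ == "__main__":
    rospy.init_node("extrinsics_tuner")
    broadcaster = tf2_ros.StaticTransformBroadcaster()

    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = "depth_camera_link"   # assumed parent frame
    t.child_frame_id = "rgb_camera_link"      # assumed child frame

    # Start from the factory values and nudge translation/rotation by hand.
    t.transform.translation.x = -0.032        # metres, example value
    t.transform.translation.y = 0.0
    t.transform.translation.z = 0.0
    q = quaternion_from_euler(0.0, 0.0, 0.0)  # roll, pitch, yaw corrections
    t.transform.rotation.x, t.transform.rotation.y = q[0], q[1]
    t.transform.rotation.z, t.transform.rotation.w = q[2], q[3]

    broadcaster.sendTransform(t)
    rospy.spin()
```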

christian-rauch · Apr 20 '22 13:04