Azure-Kinect-Sensor-SDK

Potential depth misalignment between depth camera and RGB camera

Open Basilel7 opened this issue 1 year ago • 0 comments

I'm working on a project where I need to localize an object using the RGB and depth streams. I use the ArUco library to localize the object in the RGB image, and I transform the depth image into a point cloud in the RGB camera's 3D space.
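For reference, a minimal numpy sketch of that depth-to-RGB-space step, assuming a simple pinhole model; the intrinsics and extrinsics below are made-up placeholders (in practice they come from the device calibration, e.g. `k4a_device_get_calibration`):

```python
import numpy as np

# Hypothetical pinhole intrinsics of the depth camera (fx, fy, cx, cy).
K_depth = np.array([[504.0,   0.0, 320.0],
                    [  0.0, 504.0, 288.0],
                    [  0.0,   0.0,   1.0]])

# Hypothetical depth->RGB extrinsics: rotation R and translation t (meters).
R = np.eye(3)
t = np.array([0.032, 0.0, 0.0])  # ~32 mm baseline, made up for illustration

def depth_pixel_to_rgb_space(u, v, depth_m):
    """Unproject a depth pixel to 3D, then move it into the RGB camera frame."""
    # Back-project through the inverse intrinsics and scale by the depth.
    ray = np.linalg.inv(K_depth) @ np.array([u, v, 1.0])
    p_depth = ray * depth_m          # point in depth-camera coordinates
    return R @ p_depth + t           # point in RGB-camera coordinates

# Pixel at the principal point, 1 m away, lands on the RGB camera's z-axis
# shifted by the (assumed) baseline:
p = depth_pixel_to_rgb_space(320.0, 288.0, 1.0)
print(p)  # -> [0.032 0.    1.   ] with these placeholder values
```

Any error in `K_depth`, `R`, or `t` shows up as exactly the kind of systematic offset described below, so these are the values worth double-checking against the factory calibration.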

The object should end up at the same location in the 3D RGB camera space, but the depth image of the object and the ArUco pose estimate of the object are about 1 cm to 2 cm apart, mostly along the camera axis. The error is unlikely to come from ArUco pose estimation, as previous experiments showed it to be more precise than that. So the error should come from the camera's depth estimation, or from the alignment of the depth data to the RGB camera space (depth/RGB intrinsics, or the depth-to-RGB camera extrinsics).


In red, the mesh projected with the ArUco pose; in black, the point cloud from the depth camera. Selecting the ArUco marker center directly on the point cloud and comparing it with the ArUco pose estimate confirms the depth offset.
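One way to quantify that comparison is to take the difference between the two 3D estimates of the marker center and project it onto the camera's optical (z) axis; a sketch with made-up points (the numbers are illustrative only):

```python
import numpy as np

# Hypothetical marker centers in RGB-camera space (meters).
p_aruco = np.array([0.10, -0.05, 0.800])  # from ArUco pose estimation
p_cloud = np.array([0.10, -0.05, 0.815])  # picked from the point cloud

offset = p_cloud - p_aruco
z_axis = np.array([0.0, 0.0, 1.0])        # RGB camera optical axis
along_axis = float(offset @ z_axis)       # signed offset along the camera axis

print(f"total offset: {np.linalg.norm(offset) * 1000:.1f} mm, "
      f"along camera axis: {along_axis * 1000:.1f} mm")
# -> total offset: 15.0 mm, along camera axis: 15.0 mm
```

If the offset is consistently positive (point cloud farther than the ArUco pose) across distances and marker positions, that points toward a calibration or depth-bias issue rather than random noise.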

I'm working on Ubuntu 20.04.

Basilel7 — Aug 29 '23 14:08