Potential depth misalignment between depth camera and RGB camera
I'm working on a project where I need to localize an object using the RGB and depth streams. I use the ArUco library to localize the object in the RGB image, and I transform the depth image into a point cloud in the RGB camera's 3D space.
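For reference, this is a minimal sketch of that depth-to-point-cloud step with the SDK's C transformation API (capture setup, error handling, and teardown omitted; `depth_to_color_point_cloud` is just an illustrative wrapper around the two transformation calls):

```c
#include <k4a/k4a.h>

// Build a point cloud in the RGB (color) camera 3D space from a depth image.
// depth_image comes from k4a_capture_get_depth_image(); calibration from
// k4a_device_get_calibration() for the active depth mode and color resolution.
k4a_image_t depth_to_color_point_cloud(k4a_calibration_t *calibration,
                                       k4a_image_t depth_image)
{
    k4a_transformation_t transformation = k4a_transformation_create(calibration);

    int width  = calibration->color_camera_calibration.resolution_width;
    int height = calibration->color_camera_calibration.resolution_height;

    // Depth image reprojected into the color camera's geometry.
    k4a_image_t transformed_depth = NULL;
    k4a_image_create(K4A_IMAGE_FORMAT_DEPTH16, width, height,
                     width * (int)sizeof(uint16_t), &transformed_depth);
    k4a_transformation_depth_image_to_color_camera(transformation, depth_image,
                                                   transformed_depth);

    // XYZ point cloud (int16 millimeters per axis) in color camera 3D space.
    k4a_image_t point_cloud = NULL;
    k4a_image_create(K4A_IMAGE_FORMAT_CUSTOM, width, height,
                     width * 3 * (int)sizeof(int16_t), &point_cloud);
    k4a_transformation_depth_image_to_point_cloud(transformation,
                                                  transformed_depth,
                                                  K4A_CALIBRATION_TYPE_COLOR,
                                                  point_cloud);

    k4a_image_release(transformed_depth);
    k4a_transformation_destroy(transformation);
    return point_cloud;
}
```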
The object should be at the same location in the RGB camera 3D space, but the depth image of the object and the ArUco pose estimate of the object are about 1 cm to 2 cm apart, mostly along the camera's optical axis. The error is unlikely to come from the ArUco pose estimation, as previous experiments showed it to be more precise than that. So the error should come from the camera's depth estimation or from the alignment of the depth to the RGB camera space (depth/RGB intrinsics, or depth-to-RGB camera extrinsics).
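One thing I can inspect is the factory calibration actually used for the alignment. A minimal sketch for dumping the depth-to-color extrinsics from the `k4a_calibration_t` returned by `k4a_device_get_calibration` (the function name is just illustrative; translation is in millimeters):

```c
#include <k4a/k4a.h>
#include <stdio.h>

// Print the factory depth->color extrinsics used for the depth/RGB alignment.
void print_depth_to_color_extrinsics(const k4a_calibration_t *calibration)
{
    const k4a_calibration_extrinsics_t *ex =
        &calibration->extrinsics[K4A_CALIBRATION_TYPE_DEPTH]
                                [K4A_CALIBRATION_TYPE_COLOR];

    printf("t (mm): %.2f %.2f %.2f\n",
           ex->translation[0], ex->translation[1], ex->translation[2]);
    for (int r = 0; r < 3; r++)
        printf("R row %d: %.5f %.5f %.5f\n", r,
               ex->rotation[3 * r], ex->rotation[3 * r + 1],
               ex->rotation[3 * r + 2]);
}
```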
[Figure: in red, the mesh projected with the ArUco pose; in black, the point cloud from the depth camera.]
Picking the ArUco marker center directly on the point cloud and comparing it with the ArUco pose estimate confirms the depth offset.
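The comparison amounts to something like the following sketch (`uv`, `depth_mm`, and `aruco_tvec_mm` are placeholders for the measured marker-center pixel, its aligned depth value, and the ArUco translation; `k4a_calibration_2d_to_3d` lifts a color pixel plus depth into the color camera 3D space):

```c
#include <k4a/k4a.h>
#include <math.h>
#include <stdio.h>

// Compare the marker center lifted from the aligned depth image against the
// ArUco pose estimate, both expressed in the color camera 3D space (mm).
void compare_marker_depth(const k4a_calibration_t *calibration,
                          k4a_float2_t uv, float depth_mm,
                          k4a_float3_t aruco_tvec_mm)
{
    k4a_float3_t from_depth;
    int valid = 0;

    // Lift the marker-center pixel (color camera) + its aligned depth to 3D.
    k4a_calibration_2d_to_3d(calibration, &uv, depth_mm,
                             K4A_CALIBRATION_TYPE_COLOR,
                             K4A_CALIBRATION_TYPE_COLOR,
                             &from_depth, &valid);
    if (!valid)
        return;

    float dx = from_depth.xyz.x - aruco_tvec_mm.xyz.x;
    float dy = from_depth.xyz.y - aruco_tvec_mm.xyz.y;
    float dz = from_depth.xyz.z - aruco_tvec_mm.xyz.z;
    printf("offset (mm): x=%.1f y=%.1f z=%.1f |d|=%.1f\n",
           dx, dy, dz, sqrtf(dx * dx + dy * dy + dz * dz));
}
```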
I'm working on Ubuntu 20.04.