Azure-Kinect-Sensor-SDK

Device position and orientation estimation with IMU and camera

spyoha opened this issue 4 years ago • 7 comments

I hope the Azure Kinect can also provide its (relative) position and orientation, not only raw acceleration values (just like the HoloLens does splendidly).

I've tried to implement this feature (sketched below) based on the raw IMU data available from the latest SDK (1.3.0):

  • initial pose (orientation) calibration
  • updating device movement (velocity)
  • tracking the location relative to the initial position
  • measurement error correction

But these problems made it challenging:

  • sensor measurement error (noise)
  • difficulty extracting valid observations from the noise
  • the constant force of Earth's gravity
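
Roughly, the loop I tried looks like the sketch below (illustrative C; `vec3` is a placeholder type, and the gravity vector is assumed to come from the initial calibration step):

```c
/* Naive dead-reckoning step: subtract gravity, then integrate the
 * residual acceleration twice. Noise and bias are integrated along
 * with the signal, so the position error grows quadratically with
 * time. Expressing gravity in the device frame also requires knowing
 * the device orientation, which is where the correction step lives. */
typedef struct { float x, y, z; } vec3; /* placeholder type */

void dead_reckon_step(const vec3 *accel,   /* device-frame accel, m/s^2 */
                      const vec3 *gravity, /* gravity estimate, device frame */
                      float dt,            /* sample interval, s */
                      vec3 *velocity, vec3 *position)
{
    vec3 linear = { accel->x - gravity->x,
                    accel->y - gravity->y,
                    accel->z - gravity->z };

    velocity->x += linear.x * dt;    /* update device movement (velocity) */
    velocity->y += linear.y * dt;
    velocity->z += linear.z * dt;

    position->x += velocity->x * dt; /* track location relative to start */
    position->y += velocity->y * dt;
    position->z += velocity->z * dt;
}
```

Even a small constant accelerometer bias b turns into a position error of b·t²/2 after the double integration, which is why the noise issues above dominate within seconds.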

I wonder if anyone else is already working on something similar for the Azure Kinect.

spyoha avatar Dec 02 '19 14:12 spyoha

With the existing hardware, it should be possible to provide some orientation (rotation) data just from the IMU sensors - enough to determine the pitch and roll of the camera. This would be very helpful even if the resulting numbers are a bit rough (though the same information can also be approximated with a floor estimation function). A built-in compass would add the yaw direction, at least relative to north (or relative to neighboring cameras) - and this would make it much easier to coordinate multiple cameras that overlook the same area. It would mostly resolve the orientation coordinates, leaving only the position to be solved.
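
For illustration, here is roughly how pitch and roll fall out of a single accelerometer reading when the camera is stationary, since the measured vector is then just gravity (the axis convention below is illustrative and would need to be mapped to the Azure Kinect IMU frame):

```c
#include <math.h>

/* Estimate pitch and roll (radians) from one accelerometer sample,
 * assuming the device is stationary so the reading is pure gravity.
 * Yaw is unobservable from gravity alone, which is why a compass
 * (magnetometer) would be needed to complete the orientation. */
void tilt_from_accel(float ax, float ay, float az,
                     float *pitch, float *roll)
{
    *pitch = atan2f(-ax, sqrtf(ay * ay + az * az));
    *roll  = atan2f(ay, az);
}
```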

Chris45215 avatar Jan 07 '20 21:01 Chris45215

We implemented fusion algorithms (Madgwick and Mahony) to obtain orientation from the accelerometer and gyroscope data. It works fine for pitch and roll, but the yaw drift is very high (20° of yaw drift error in 1 minute). Do you plan to develop a more robust solution to obtain the orientation of the sensor?
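
For anyone attempting the same, the core idea of these filters is a complementary update: integrate the gyroscope for fast response, then nudge pitch and roll toward the accelerometer's gravity direction. A simplified sketch (not our actual Madgwick/Mahony implementation, and using Euler angles instead of quaternions for brevity):

```c
#include <math.h>

typedef struct { float pitch, roll, yaw; } attitude; /* radians */

/* Simplified complementary filter. Only pitch and roll receive a
 * correction; yaw has no absolute reference without a magnetometer,
 * so gyro bias makes it drift without bound. The gyro-axis-to-angle
 * mapping below is simplified and valid only for small tilts. */
void complementary_update(attitude *att,
                          float gx, float gy, float gz, /* gyro, rad/s */
                          float ax, float ay, float az, /* accel, m/s^2 */
                          float dt, float alpha /* blend gain, e.g. 0.98 */)
{
    /* Predict by integrating angular rate. */
    att->pitch += gy * dt;
    att->roll  += gx * dt;
    att->yaw   += gz * dt; /* uncorrected: this is the drifting term */

    /* Correct pitch/roll toward the gravity-derived tilt. */
    float acc_pitch = atan2f(-ax, sqrtf(ay * ay + az * az));
    float acc_roll  = atan2f(ay, az);

    att->pitch = alpha * att->pitch + (1.0f - alpha) * acc_pitch;
    att->roll  = alpha * att->roll  + (1.0f - alpha) * acc_roll;
}
```

No tuning of the correction gain can fix the yaw term, because nothing in the accelerometer measurement constrains it.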

PierrePlantard avatar Feb 29 '20 10:02 PierrePlantard

It is very important for vision applications (robotics, mobile tracking) to provide a VIO (visual inertial odometry) method to obtain the position and orientation of the camera. All the hardware components are already embedded in the Azure Kinect (IMU, depth camera, color camera); it would be a shame not to use them. The competitors already provide such a feature in the SDKs of their depth cameras (Intel RealSense, ZED stereo camera), as do many mobile AR development kits (ARKit, ARCore, ...). Do you plan to develop such a feature?
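
The raw inputs a VIO pipeline needs are in fact already exposed by the Sensor SDK; only the fusion is missing. A minimal sketch of the IMU side in C (pair it with k4a_device_get_capture() for the image stream; error handling is abbreviated):

```c
#include <k4a/k4a.h>
#include <stdio.h>

int main(void)
{
    k4a_device_t device = NULL;
    if (K4A_FAILED(k4a_device_open(K4A_DEVICE_DEFAULT, &device)))
        return 1;

    /* The SDK requires the cameras to be running before the IMU. */
    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
    config.camera_fps = K4A_FRAMES_PER_SECOND_30;
    if (K4A_FAILED(k4a_device_start_cameras(device, &config)) ||
        K4A_FAILED(k4a_device_start_imu(device)))
    {
        k4a_device_close(device);
        return 1;
    }

    /* Pump timestamped accel/gyro samples: exactly the measurements
     * an external VIO/SLAM library would consume. */
    for (int i = 0; i < 100; ++i)
    {
        k4a_imu_sample_t s;
        if (k4a_device_get_imu_sample(device, &s, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
        {
            printf("t=%llu us acc=(%.3f %.3f %.3f) m/s^2 gyro=(%.3f %.3f %.3f) rad/s\n",
                   (unsigned long long)s.acc_timestamp_usec,
                   s.acc_sample.xyz.x, s.acc_sample.xyz.y, s.acc_sample.xyz.z,
                   s.gyro_sample.xyz.x, s.gyro_sample.xyz.y, s.gyro_sample.xyz.z);
        }
    }

    k4a_device_stop_imu(device);
    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
```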

PierrePlantard avatar Apr 17 '20 16:04 PierrePlantard

https://feedback.azure.com/forums/920053-azure-kinect-dk/suggestions/38471407-imu-example-or-api
https://feedback.azure.com/forums/920053-azure-kinect-dk/suggestions/40198582-please-provide-a-visual-inertial-odometry-vio-me
https://feedback.azure.com/forums/920053-azure-kinect-dk/suggestions/39018925-add-slam-function

HiroyukiSakoh avatar Jun 25 '20 07:06 HiroyukiSakoh

We would like pose and position estimation for SLAM too, for scanning live environments (one of many possible uses in live VFX production). Without extensive research into this problem (and perhaps still ending up with noisy or biased data), the IMU data provided by the SDK is not useful, for us at least. We are not primarily an R&D lab, so devoting manpower to solving such problems is unproductive for us. Any solution or example material would be much appreciated.

neilDGD avatar Jul 08 '20 14:07 neilDGD

Any news regarding this?

dnlwbr avatar Dec 02 '20 09:12 dnlwbr

Was there any progress on this?

Thaina avatar Sep 15 '22 05:09 Thaina