
DeviceMotion instead of Accelerometer?

Open inorganik opened this issue 7 years ago • 4 comments

Hi,

I'm just curious why you chose to use accelerometer data and the CLLocation heading, instead of CMDeviceMotion, which fuses accelerometer, gyro, and magnetometer data?

Currently, the annotations' movement has some latency: when the camera moves, it takes a moment for the annotations to catch up. I'm guessing this is caused by the latency of the accelerometer.

I recently discovered the app FlightRadar which has a very precise AR view. It displays annotations for where planes are, over a camera view, very similarly to this lib. But the annotations move completely in sync with the camera view. My guess is that they are using CMDeviceMotion, which has much more precise tracking, in combination with location and heading. I asked them about how they did it but all they would tell me was that it's all done in-house and has been in the app "for 6 years".

I don't mean to downplay what you've accomplished here; clearly you've solved a lot of problems with math that is over my head. I'm just wondering if this could be adapted to use the three Euler angles in the CMDeviceMotion attitude (pitch, roll, yaw) to present annotations in a more responsive way. Perhaps you could map a reference point in degrees given by roll to the user's heading, and with that reference point, calculate the difference from the current roll to get snappier left-right marker movement.
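For context, reading those three Euler angles from CMDeviceMotion looks roughly like this. This is a minimal sketch of the idea, not anything HDAugmentedReality currently does:

```swift
import CoreMotion

let motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let attitude = motion?.attitude else { return }
    // Euler angles in radians, fused from accelerometer, gyro and magnetometer.
    let pitch = attitude.pitch
    let roll = attitude.roll
    let yaw = attitude.yaw
    // ... drive annotation positions from these instead of raw accelerometer data ...
    _ = (pitch, roll, yaw)
}
```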

Despite the imminence of ARKit, this lib still has great relevance, for multiple reasons:

  • calculates azimuth of annotations
  • could potentially work on the front camera (ARKit can't)
  • better compatibility, because it works on devices running less than iOS 11 and on devices that do not have the A9 chip

Thanks for all your hard work on this.

inorganik avatar Aug 16 '17 22:08 inorganik

Hello,

The latency you are seeing is caused by low pass filters in ARTrackingManager, not the accelerometer readings.

Horizontal movement (heading)
I am not using the accelerometer or gyro here, only the raw CLLocationManager heading value, with a low pass filter applied (default filter factor of 0.05), which causes the latency. CLLocationManager handles a lot of local/internal/external magnetic field calculations for heading, so I believe this is the best way; CLLocationManager is probably also assisted by the gyro internally.

If you want to scale down the filtering for the heading you can use something like this:

arViewController.trackingManager.headingFilterFactor = 0.5
arViewController.trackingManager.locationManager.headingFilter = kCLHeadingFilterNone

Vertical movement (pitch)
I am using the accelerometer here; these readings don't have latency, but they jump like hell, so aggressive filtering is needed (this adds the latency). If I remember correctly, I tried using CMDeviceMotion.attitude.pitch but couldn't differentiate up from down or something like that; I probably gave up too quickly. There is plenty of room for improvement here. You can try this for the moment:

arViewController.trackingManager.pitchFilterFactor = 0.5

Overall
The low pass filter is really primitive and is probably not the best choice. I'll try to compare accelerometer vs gyro for pitch when I get back from vacation.
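For reference, the low pass filter discussed here is a simple exponential smoothing. A minimal sketch (not the actual ARTrackingManager code, and ignoring the 0/360 wraparound a real heading filter must handle):

```swift
// Exponential low pass filter: a filterFactor near 0 gives heavy smoothing
// (stable but laggy); near 1 gives little smoothing (responsive but jumpy).
func lowPassFiltered(newValue: Double, previous: Double, filterFactor: Double) -> Double {
    return previous + filterFactor * (newValue - previous)
}

var heading = 0.0
// Three identical readings of 10 with the default factor of 0.05:
for reading in [10.0, 10.0, 10.0] {
    heading = lowPassFiltered(newValue: reading, previous: heading, filterFactor: 0.05)
}
// heading has only reached ~1.43 after three updates; that slow
// convergence is the latency described above.
```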

Thank you very much for your input, it helps a lot.

DanijelHuis avatar Aug 17 '17 12:08 DanijelHuis

I found you can get a pitch value that differentiates between leaning forward and backward if you use quaternions:

if let quat = motionData?.attitude.quaternion {
    // Quaternion-to-Euler pitch (rotation about the x axis), in degrees.
    let qPitch = CGFloat(self.radiansToDegrees(atan2(2 * (quat.x * quat.w + quat.y * quat.z), 1 - 2 * quat.x * quat.x - 2 * quat.z * quat.z)))
}

qPitch will equal 0 when the device is flat on its back, 90 when standing upright, and 180 when lying on its face. The opposite hemisphere goes from 0 to -180. That may help with vertical movement.
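A quick sanity check of that formula against known rotations, as plain math (the helper name here is illustrative):

```swift
import Foundation

// Pitch in degrees from a quaternion, matching the formula above.
func pitchDegrees(x: Double, y: Double, z: Double, w: Double) -> Double {
    return atan2(2 * (x * w + y * z), 1 - 2 * x * x - 2 * z * z) * 180 / .pi
}

// Identity quaternion: device flat on its back.
print(pitchDegrees(x: 0, y: 0, z: 0, w: 1))   // 0.0
// 90-degree rotation about the x axis: device standing upright.
let s = sin(Double.pi / 4)
print(pitchDegrees(x: s, y: 0, z: 0, w: s))   // ~90.0
```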

inorganik avatar Aug 17 '17 15:08 inorganik

I'm finding that with the settings as you suggested:

arViewController.trackingManager.headingFilterFactor = 0.5
arViewController.trackingManager.locationManager.headingFilter = kCLHeadingFilterNone

The annotations get stuck if I move the camera too fast.

Quaternions to the rescue! I found that using Quaternions for roll is pretty bulletproof. You can also use them for yaw:

let qRoll = CGFloat(self.radiansToDegrees(2 * (quat.x * quat.y + quat.w * quat.z)))
let qYaw = CGFloat(self.radiansToDegrees(atan2(2 * (quat.y * quat.w - quat.x * quat.z), 1 - 2 * quat.y * quat.y - 2 * quat.z * quat.z)))

I was experimenting in a different app using these values to move a view around and it's very precise. qRoll is 0 for wherever the device is facing when motion tracking begins. If you simply mapped 0 to a heading, you could have a very precise left-right movement with no latency.
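That mapping could be sketched like this. Everything here is hypothetical: `referenceHeading` would be captured from CLLocationManager at the moment motion tracking begins, and the sign convention would need checking against the device's axes:

```swift
import Foundation

var referenceRoll: Double?          // qRoll captured when tracking begins
var referenceHeading: Double = 120  // compass heading captured at the same moment

// Map the change in qRoll since tracking started onto the compass heading.
func updatedHeading(qRoll: Double) -> Double {
    if referenceRoll == nil { referenceRoll = qRoll }
    // The sign of delta depends on device orientation and axis conventions,
    // so treat this as a starting point only.
    let delta = qRoll - (referenceRoll ?? 0)
    var heading = (referenceHeading + delta).truncatingRemainder(dividingBy: 360)
    if heading < 0 { heading += 360 }
    return heading
}

print(updatedHeading(qRoll: 10))   // 120.0 - first call sets the reference
print(updatedHeading(qRoll: 20))   // 130.0 - rotated 10 degrees since then
```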

inorganik avatar Aug 17 '17 16:08 inorganik

Heading
What do you mean by the annotations getting stuck? I just tried it and it works very well even with 300 annotations on screen, no latency at all. Tested on the demo project with an iPhone 7 running iOS 10.

I don't think there is any need to calculate the heading from the motion manager; the values are not the same ( https://github.com/foundry/MagnetoMeter ), and as far as I can see the readings from CLLocationManager are very stable, so it probably uses the gyro/accelerometer internally.

qRoll is 0 for wherever the device is facing when motion tracking begins. If you simply mapped 0 to a heading...

You can set reference frame to CMAttitudeReferenceFrame.xMagneticNorthZVertical or CMAttitudeReferenceFrame.xTrueNorthZVertical. This way you don't have to map it manually and no problems with syncing/recalibrating gyro and compass.
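A minimal sketch of starting device motion updates with a north-referenced frame (assuming standard CoreMotion APIs; annotation placement is elided):

```swift
import CoreMotion

let motionManager = CMMotionManager()
// With this reference frame, yaw is measured relative to magnetic north
// rather than the arbitrary device orientation at startup, so no manual
// mapping between gyro and compass is needed.
if CMMotionManager.availableAttitudeReferenceFrames().contains(.xMagneticNorthZVertical) {
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        let yawDegrees = attitude.yaw * 180 / .pi
        // ... position annotation views from yawDegrees ...
        _ = yawDegrees
    }
}
```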

Pitch
Regarding pitch, I tried it and it definitely gives more stable results than the accelerometer. A small problem is device orientation; I think the motion manager must be restarted on orientation change. I will probably implement this. Thanks!

DanijelHuis avatar Aug 17 '17 23:08 DanijelHuis