eye-tracking-ios-prototype
Preferred geometry of face to iPhone?
Hi,
I have downloaded, built, and deployed your app; the results are a bit "wild," as it were - the attention point displayed on the screen does not match well what I am actually looking at.
Is there a defined / preferred geometry - the face at one meter, centered on the screen, for example? If so, I will flange that up with cardboard and duct tape. Reading through the code does not give much in the way of a hint about this.
Thanks in advance,
-Rob-
Was just about to ask a similar question... I was wondering if the height compensation has to do with a preferred perspective. Can you explain that value a bit more?
After some experiments with Apple's example, it looks like the eye transform properties of ARFaceAnchor are very dependent on the orientation of your face. I can maneuver my face into an orientation where viarakri's eye tracking seems to work OK, but that's not a very practical solution.
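To illustrate why head orientation feeds into the result: ARFaceAnchor's gaze-related values (`lookAtPoint`, `leftEyeTransform`, `rightEyeTransform`) are expressed in the face anchor's own coordinate space, so any naive projection onto the screen inherits the head pose. A minimal sketch, assuming a standard `ARSCNView`-based session (the function name and the `sceneView` parameter are placeholders, not code from this repo):

```swift
import ARKit
import SceneKit

// Project the face anchor's gaze estimate onto screen coordinates.
// Because lookAtPoint lives in face-anchor space, the head's orientation
// is baked into the result - likely why tracking drifts as the face
// rotates relative to the device unless it is compensated for.
func gazeScreenPoint(for faceAnchor: ARFaceAnchor,
                     in sceneView: ARSCNView) -> CGPoint {
    // Lift lookAtPoint from face-anchor space into world space.
    let worldLookAt = faceAnchor.transform
        * simd_float4(faceAnchor.lookAtPoint, 1)
    // Let the renderer project the world point into view coordinates.
    let projected = sceneView.projectPoint(
        SCNVector3(worldLookAt.x, worldLookAt.y, worldLookAt.z))
    return CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
}
```

This is only a sketch of the underlying geometry, not the repo's actual pipeline; a practical implementation would also need per-user calibration and the height compensation discussed above.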
Take a look at #4; hopefully it fixes some of the problems inherent in face tracking and the AR world.
I am doing some R&D. I want to detect the pupil size of an eye using the camera; any suggestions?