
support for lip and eye tracking

Open gb2111 opened this issue 3 years ago • 10 comments

I know it might be early but is there any plan yet to add API for lip and eye-tracking? Thank you

gb2111 avatar Apr 13 '21 13:04 gb2111

I expect it would be through the input API, in which case whoever makes the first hardware with a SteamVR driver, would probably define what the paths are.

TheDeveloperGuy avatar Apr 13 '21 14:04 TheDeveloperGuy

You are right. That seems to be the correct way. Unless there is something specific that I don't see now. In the next couple of months it is up to HTC I guess.

gb2111 avatar Apr 13 '21 14:04 gb2111

I was asking this myself, to be honest, especially since we have the Vive tracker providing blendshape data in one way and Apple AR providing it in another. The only consensus appears to be largely using shape key data instead of, or alongside, facial landmarks.

Personally, I've forked the repo to have a look at adding shape key floats, exposed as something like vr::VRInput()->GetFacialKeyData() mapped to values such as VRFacial_NoseSneer_R, VRFacial_EyeLook_U, VRFacial_MouthClose, etc. (adapting a few examples from Unity/ARKit).

It's probably beyond my ability to implement or influence to even get a PR accepted, but it's worth a shot.

Minothor avatar May 23 '21 15:05 Minothor

Well, it's been 7 months; my buddy and I now need facial tracking to be supported in our driver, and since there seems to be zero progress on the topic, we'll do it ourselves.

We already have eye tracking working (and mouth tracking in the works) and hardware for it; the only thing we're missing is the driver implementation. And I'm responsible for the driver implementation 🥲

I'll try using the input API where possible; hopefully I won't have to reinvent the wheel...

okawo80085 avatar Nov 24 '21 19:11 okawo80085

@okawo80085 I'd like to follow your progress on that if that's possible? My own efforts on a Pi Zero based solution have stagnated a little since putting a hole in my 3D printer's polarising film.

Minothor avatar Nov 24 '21 19:11 Minothor


We started working on it very recently, we'll post general progress in this feature request and here

But most of the discussion related to this project is in discord, link to the server we hang out in

okawo80085 avatar Nov 24 '21 19:11 okawo80085

Small update on eye and face tracking: we decided to stick with events for now. The data formats needed for face and eye tracking are too complex for the current VRInput API, and the easiest way to send custom structs to applications is to have our driver send out vendor events. We haven't decided on the final structs that will be sent, though.

okawo80085 avatar Dec 03 '21 07:12 okawo80085

Well, using events to pass the full eye tracking state didn't work, so we switched to using events only to signal whether eye tracking is active, and shared memory to actually pass the gaze state (the shared memory is owned and created by the driver device). The new driver device is almost done (it still needs some cosmetic tweaks).

okawo80085 avatar Dec 06 '21 14:12 okawo80085