Implement hand tracking using WebXR Hand Input API
What about implementing hand tracking using the WebXR Hand Input API?
Here's how hand tracking looks in the Hand Tracking sample from the Immersive Web Working Group's WebXR Samples, running on a Meta Quest 2: https://web.dev/pwas-on-oculus-2/#hand-tracking.
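For reference, here's a minimal sketch of what using the WebXR Hand Input API looks like. The `'hand-tracking'` feature flag, the `hand` attribute on `XRInputSource`, and `frame.getJointPose()` are from the spec; the function names and frame-loop structure are just illustrative, not Hubs code:

```javascript
// An XRInputSource represents a tracked hand (rather than a controller)
// when its `hand` attribute is set.
function supportsHandInput(inputSource) {
  return !!inputSource.hand;
}

// Illustrative setup, not Hubs code: 'hand-tracking' must be requested
// as a session feature for hand input sources to appear.
async function startHandTracking() {
  const session = await navigator.xr.requestSession('immersive-vr', {
    optionalFeatures: ['hand-tracking'],
  });
  const refSpace = await session.requestReferenceSpace('local-floor');

  session.requestAnimationFrame(function onFrame(time, frame) {
    for (const inputSource of session.inputSources) {
      if (!supportsHandInput(inputSource)) continue;
      // `hand` is a map-like collection of XRJointSpaces keyed by
      // joint name, e.g. 'index-finger-tip', 'thumb-tip', 'wrist'.
      const indexTip = inputSource.hand.get('index-finger-tip');
      const pose = frame.getJointPose(indexTip, refSpace);
      if (pose) {
        // pose.transform.position and pose.radius are now available
        // for rendering a hand mesh or running gesture detection.
      }
    }
    session.requestAnimationFrame(onFrame);
  });
}
```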
How do you imagine walking or teleporting with hand tracking?
@elmau Using hand gestures. See https://youtu.be/fsxH-z8qx58. :wink:
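A gesture trigger like the one in that video could be approximated with a pinch check, i.e. thumb tip close to index tip. A pure-geometry sketch (the 0.02 m threshold is a rough guess on my part, not a value from the video or from Hubs):

```javascript
// a and b are {x, y, z} positions in meters, e.g. taken from
// frame.getJointPose(...).transform.position for two joints.
function jointDistance(a, b) {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Returns true when thumb tip and index tip are within `threshold`
// meters of each other; this could arm or fire a teleport.
function isPinching(thumbTipPos, indexTipPos, threshold = 0.02) {
  return jointDistance(thumbTipPos, indexTipPos) < threshold;
}
```

In a real integration you'd also want some hysteresis (separate engage/release thresholds) so the teleport doesn't flicker at the boundary.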
That looks like a really good solution. I thought of having a "virtual cellphone" to handle chat or moving around the scene.
This isn't on our scheduled near-term roadmap, but we'll keep an eye on it for the future. Thank you for the suggestion.
Hand tracking is a great feature. Having used it daily in virtual reality applications, I must admit it really changes how you use VR. It would be really nice to have it in Hubs :)
Agreed, this would be really helpful for accessibility, and lots of headsets are now being sold without controllers as well.
There's also this "walk" hand gesture, which is pretty nice and seems intuitive, especially for folks who aren't used to hand gestures:
https://youtu.be/ebM1IEn_12U
Would a PR be welcome for something like this, perhaps modeled on these control methods?
https://github.com/mozilla/hubs/blob/dd1443f321eecdcabc9630c273e4d0ace0de6db2/doc/index.md#wasd-to-analog2d
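The wasd-to-analog2d binding linked above turns discrete key state into a 2D analog axis, so a gesture-based input could plausibly feed the same kind of output. A hypothetical sketch of that mapping; none of these names come from the Hubs bindings code:

```javascript
function clamp(v, lo, hi) {
  return Math.min(hi, Math.max(lo, v));
}

// walkStrength in [0, 1]: how fully the "walk" gesture is formed
// (hypothetical signal from a gesture detector).
// yawRadians: direction the gesture points relative to the camera.
// Returns an {x, y} analog pair in [-1, 1], the same shape of output
// a wasd-to-analog2d style binding produces.
function gestureToAnalog2d(walkStrength, yawRadians) {
  const s = clamp(walkStrength, 0, 1);
  return {
    x: clamp(Math.sin(yawRadians) * s, -1, 1),
    y: clamp(Math.cos(yawRadians) * s, -1, 1),
  };
}
```

Feeding an existing axis-shaped binding, rather than adding a new locomotion path, would presumably keep the gesture input composable with the current control system.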