
*Updated* Gesture bindings + Speech Recognition? Game issue/Suggestion

Open AINXTGENStudio opened this issue 1 year ago • 0 comments

Project Demigod

Description

Quest 2 hand-tracking gestures, 2 main issues:

  • Main issue: the trigger gesture won't pull/swing. I can activate and hold the trigger to shoot a web and hold on, but I'm unable to pull or swing (rough sketch of what I mean below).
  • 2nd main issue: the System button and the trigger activate at the same time in SteamVR, constantly spamming screenshots while doing almost anything, especially with the hands down at the side of the waist.

Other gestures, minor issues:

  • Navigation gesture: tough to walk left (left-hand navigation joystick to the left).
  • Rotation gesture: tough to rotate right (right-hand navigation joystick to the right).
  • Joystick click and both pinches (thumb and middle/ring) randomly activate (most likely user error).

I finally tried your project today and it is very impressive overall! Virtual Desktop only has trigger and grip hand-gesture bindings, but there I am able to hold the trigger and pull/swing very well. If you could fix your trigger gesture, hand tracking would be much closer to fully playable. So close! :)
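About the main trigger issue: I have no idea how ALVR maps the pinch gesture internally, so this is purely a hypothetical sketch in Rust (not ALVR code), but the symptom feels like the game only gets a binary trigger click and never a smooth analog pull to read the swing from:

```rust
// Purely hypothetical sketch, not ALVR code: if the trigger gesture only
// reports a binary click, a game that reads the analog trigger axis for
// "pull"/"swing" would never see the gradual pull it expects.
fn trigger_from_pinch(pinch_strength: f32) -> (bool, f32) {
    let click = pinch_strength > 0.9;           // binary "trigger pressed"
    let value = pinch_strength.clamp(0.0, 1.0); // analog "trigger pull" axis
    (click, value)
}

fn main() {
    for strength in [0.2_f32, 0.5, 0.95] {
        let (click, value) = trigger_from_pinch(strength);
        println!("pinch {strength:.2} -> click {click}, analog {value:.2}");
    }
}
```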

General Troubleshooting

  • [x] I carefully followed the instructions in the README and successfully completed the setup wizard
  • [x] I read the ALVR GitHub Wiki

Environment

Hardware

Note: for Linux, an upload to the hw-probe database is preferred: hw-probe -all -upload

CPU: Intel(R) Core(TM) i7-9750H @ 2.60 GHz

GPU: NVIDIA GTX 1660 Ti

GPU Driver Version: 31.0.15.5222

Audio: Realtek(R)

Installation

ALVR Version: v20.8.1

ALVR Settings File: ? (downloaded the Windows streamer and the SideQuest ALVR client on 06/18/24)

SteamVR Version: 2.6.2

Install Type:

  • [x] Packaged (exe, deb, rpm, etc.)
  • [x] Portable (zip)
  • [ ] Source

OS Name and Version (winver on Windows or grep PRETTY_NAME /etc/os-release on most Linux distributions):

I am really looking forward to testing your project further! It immediately struck me that, if possible, your clever gestures combined with speech recognition could expand controller capabilities enough to surpass conventional controllers themselves. For example: Y/B = pinch thumb and middle; combined with voice commands, you could theoretically have a near-unlimited number of options and customizations.
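To make that concrete, here is a rough sketch of what I am imagining (just my own illustration, not an existing ALVR feature; the gesture names and phrases are made up). The same gesture fans out into many actions depending on the spoken command:

```rust
use std::collections::HashMap;

// Made-up gesture names, just to show the combinatorial idea:
// one gesture + N voice commands = N distinct actions.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
enum Gesture {
    PinchThumbMiddle, // e.g. what Y/B is bound to today
    PinchThumbRing,
}

fn main() {
    let mut bindings: HashMap<(Gesture, &str), &str> = HashMap::new();
    bindings.insert((Gesture::PinchThumbMiddle, "inventory"), "open inventory");
    bindings.insert((Gesture::PinchThumbMiddle, "map"), "open map");
    bindings.insert((Gesture::PinchThumbRing, "recenter"), "recenter playspace");

    // Pretend the headset just recognized this gesture + phrase pair.
    let event = (Gesture::PinchThumbMiddle, "map");
    match bindings.get(&event) {
        Some(action) => println!("firing action: {action}"),
        None => println!("no binding for {event:?}"),
    }
}
```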

And as I write this, another idea comes to mind, though it would need upcoming, more capable hardware: gesture bindings + speech recognition + eye tracking! Maybe this can inspire future iterations. A gesture would activate the binding, a voice command would populate the corresponding menu/map, and eye tracking would select the desired option, and so on. That would leave more dedicated gestures free for more fluid movement.
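Again purely to illustrate the flow I have in mind (the stage names and options are made up; nothing here exists in ALVR today):

```rust
// Made-up stages for the gesture -> voice -> eye-tracking flow described above.
#[derive(Debug)]
enum Stage {
    Idle,
    GestureArmed { binding: &'static str },
    MenuOpen { options: Vec<&'static str> },
    Selected { option: &'static str },
}

fn main() {
    let mut stage = Stage::Idle;
    println!("{stage:?}");

    // 1. A gesture activates the binding.
    stage = Stage::GestureArmed { binding: "quick actions" };
    println!("{stage:?}");

    // 2. A voice command populates the corresponding menu/map.
    stage = Stage::MenuOpen { options: vec!["web swing", "zip line", "wall crawl"] };
    println!("{stage:?}");

    // 3. Eye tracking selects whichever option the player is looking at
    //    (here we just pretend the gaze landed on the first entry).
    let gazed_at = match &stage {
        Stage::MenuOpen { options } => options[0],
        _ => "nothing",
    };
    stage = Stage::Selected { option: gazed_at };
    println!("{stage:?}");
}
```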

Also, are we able to configure our own preferred gestures for this? If not, customizable gestures would without a doubt be an explosive catalyst for this project and would spread quickly. If yes, how do we create our own customized gestures?

Anyway, I hope these suggestions help further your project, and I really hope this original project ends up working the way I imagine it would. Thank you for your contributions so far, and good luck with your next goals!

AINXTGENStudio · Jun 18 '24 03:06