Using acceleration data from Joy-Cons
According to the ALVR for Vision Pro client developer, Joy-Con acceleration data is sent by the client but not used by the streamer. See https://github.com/alvr-org/alvr-visionos/issues/109. Can this be fixed?
I'm pretty sure this is something that should be handled by the visionOS client
No? This is prediction related, which is a server responsibility
OK, the OP's concern is about whether acceleration info is used at all, which it is. So I don't understand the original issue. And yes, the position probably appears lagged at the moment (that will be fixed relatively soon).
Sorry if my initial message was a bit unclear. When using Joy-Cons together with the Vision Pro, the positioning of your hands in-game (take Half-Life: Alyx as an example) is correctly captured by the Vision Pro's hand tracking and transferred via ALVR. Many users combine the Vision Pro with Joy-Cons, though, to get more precise button control. These are connected to the Vision Pro over Bluetooth, and their buttons/triggers can be used in-game. The only issue is that velocity from "throwing objects" doesn't seem to come through. In Half-Life: Alyx, for example, you pick up a bottle and make a throwing motion with your hand, but as soon as you release the grip, the bottle doesn't fly as expected; it drops straight to the ground. According to the Vision Pro ALVR client developer, shinyquagsire23, this data is correctly sent by the client. Is there a way to make this work server side?
I'm running into the same issue. This makes many games like Punch Beat currently unplayable.
@maatjes2 AFAIK, on the Android client, velocity is correctly sent to SteamVR.