VRTK
Vive controller - remote control
I would be very interested in being able to use the Vive controller as a remote controller, as with the space craft on the table in The Lab.
I already have the craft set up in the Unity scene, controlled by the WASD keys, so any idea how I can grab the Vive controller's position and translate it into those controls?
Check out this example scene:
https://www.youtube.com/watch?v=4J8abeLzH58&index=16&list=PLRM1b2lKjTbdFJtYv_SNAb3NvYp-Gl7nZ
It should do what you want.
Thanks for that.
Yes, I had a play with the touchpad controller, but personally, in terms of input it is nowhere near as intuitive, nor does it give as fine control, as the Vive controller's own movement, as with The Lab drone.
Kind regards
Gary
@noorbeast what do you mean by it's not as intuitive?
The example shows off how you use the touch locations on the touchpad as an input. If you want to use those values to control an object then that's up to the developer to implement it.
The toolkit isn't really intended to be a bunch of assets people can use, but a bunch of concepts people can take and develop off of as a platform.
Yes I get that.
What I am trying to convey is that using the whole controller as the input device, like the table drone example in The Lab, is really intuitive and gives fine control for that sort of gameplay, compared to traditional input like the touchpad. I hope that makes some sort of sense.
@noorbeast ah ok, I see what you mean: you're talking about the controller actually moving the thing based on its rotation, like one big gyroscopic remote control?
Absolutely, sorry I was so awkward trying to describe what I meant.
@noorbeast As a quick side note, we implemented this in our game but ended up going with the touchpad for accuracy. Using the controller's gyro rotation can only really be used for imprecise movement and requires a fairly high tolerance. Expect a lot of false positives, as you'll get different results depending on whether the player is sitting or standing, holding the controllers close or with arms outstretched, or facing the monitor straight on or not. Just my 2 cents. Hope you can get it working in your situation though :)
Cheers
That is interesting.
Everyone I have had try out the drone in The Lab, including a drone pilot, has commented on how simple it is and how well it works for them, though that use does not need precision; most get the biggest kick out of teasing the Robo Dog with the drone.
@noorbeast Yes, for general movement it may be ok. For something like gestures using the gyro, it just wasn't working in our setup. The accuracy of the controllers is great, but seeing how differently players hold them throws the maths out a bit. We started with a tolerance of 10%, then upped it to 20%, which worked ok but more often than not wouldn't engage; upping it to 30%, we were getting too many false positives in the gesture recognition. We would set it up one way, then the player would hold the controllers slightly differently or not face the monitor directly, and it wouldn't engage or would engage too much. For general movement it might be ok, as you have a wide margin to play with (in degrees) and it doesn't really need much accuracy. I guess if you have a calibration setup at the beginning of your game, you can see what the player calls horizontal, vertical, or pitched, then set an offset to the controller reading in your game :)
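The calibrate-then-check-tolerance approach described above could be sketched like this (a Python stand-in for the Unity-side maths; the function names and the sample numbers are illustrative, not toolkit API):

```python
def corrected(reading, neutral):
    """Apply the calibration offset: angles are measured relative to
    whatever pose the player called 'straight ahead' at calibration."""
    return reading - neutral

def gesture_engaged(reading, neutral, target, tolerance_deg):
    """Engage the gesture only when the offset-corrected angle (degrees)
    falls within the tolerance band around the target angle."""
    return abs(corrected(reading, neutral) - target) <= tolerance_deg

# A player rests with the controller tilted 12 degrees up, so a raw
# pitch of 57 degrees is really 45 degrees relative to that player:
engaged = gesture_engaged(57.0, neutral=12.0, target=45.0, tolerance_deg=10.0)
```

Per-player calibration is what keeps the tolerance band tight: without the offset, the same raw reading would mean different gestures for a seated player and a standing one.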
That is exceedingly helpful advice.
It does flag some potential issues for something else on my todo list that I wanted to experiment with: motion cancellation for motion simulators, where the motion of the sim is tracked by mounting the controllers to it, and the simulator's movement is then subtracted from the HMD visuals.
I don't know if it would work at all, but as the most active member of the DIY motion sim community I do know it is desperately needed to make the most of VR and motion simulation. And the alternative for the Rift (VectionVR) is not practical, as it requires compiling against the game's source code.
From what you have said, at a minimum there would need to be a calibration process for the controllers mounted to the motion rig. Do you think the tracking would be accurate enough to cancel the motion from the HMD view in real time? Latency is the killer.
Kind regards
Gary
@noorbeast That's a very interesting project. I assume the motion rigs are all custom-built by community members, so there is no set standard for where to place the motion controllers; I'd have to agree you'd need a calibration setup in your game, which wouldn't be difficult to implement at all. The tracking is incredibly accurate once calibrated, so using the controllers' tracking data to offset the HMD view in real time shouldn't be an issue. If I understand correctly what you're trying to achieve, you'd simply grab the controller's rotation, perhaps run it through a middle man for calibration/offset, invert the values, and apply that rotation to the in-game (HMD) camera every frame. It's much how video stabilization software works: track the motion, invert it, and apply it to the original camera, and voila, stable video. There's probably a bit more to it than this, but this is how I'd start.
Cheers
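The invert-and-apply idea above could be sketched with plain quaternion maths (a Python stand-in, not a Unity API; `cancelled_view` and the (w, x, y, z) layout are illustrative, and a real implementation would also handle positional offset):

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)

def quat_conj(q):
    """Conjugate (the inverse for unit quaternions): the 'invert' step."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def cancelled_view(hmd_rotation, rig_rotation, rig_neutral):
    """Subtract the rig's motion from the HMD rotation: work out how far
    the rig has rotated from its calibrated neutral pose, invert that
    delta, and compose it with the HMD pose (applied every frame)."""
    rig_delta = quat_mul(rig_rotation, quat_conj(rig_neutral))
    return quat_mul(quat_conj(rig_delta), hmd_rotation)

# If the HMD only rotated because the rig tilted 90 degrees, the
# cancelled view should come out as (near) identity:
half = math.radians(90) / 2
tilt = (math.cos(half), 0.0, math.sin(half), 0.0)
identity = (1.0, 0.0, 0.0, 0.0)
view = cancelled_view(tilt, tilt, identity)
```

The neutral-pose argument is where the calibration step feeds in: it captures how the controller happens to be mounted on the rig, so only genuine rig movement gets subtracted.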
Yes @x-mugen-x that is a good summary of what I had in mind.
Is it worth having some new events on ControllerEvents, things like:
Yaw(rotation)
- when you move the controller in a left/right motion (0 degrees being straight ahead)
Pitch(angle)
- when you tilt the controller in an up/down motion (0 degrees being straight ahead)
Roll(rotation)
- when you rotate the controller in a left/right motion (whilst looking ahead)
Then these values could be used to control something.
It's basically the rotation values of the controller, but tidied up in a neater way.
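A rough sketch of how such yaw/pitch/roll values could be decomposed from a controller rotation quaternion (a Python stand-in; the axis convention and Euler order here are assumptions, and a real implementation would need to match Unity's):

```python
import math

def controller_angles(q):
    """Decompose a controller rotation quaternion (w, x, y, z) into the
    yaw/pitch/roll values (degrees) the proposed events would emit, with
    0 degrees meaning the controller points straight ahead. Here yaw is
    taken about the vertical (y) axis, pitch about x, and roll about z."""
    w, x, y, z = q
    yaw = math.degrees(math.atan2(2 * (w * y + x * z), 1 - 2 * (x * x + y * y)))
    # Clamp before asin to guard against floating-point drift past +/-1.
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, 2 * (w * x - y * z)))))
    roll = math.degrees(math.atan2(2 * (w * z + x * y), 1 - 2 * (x * x + z * z)))
    return yaw, pitch, roll

# A controller yawed 30 degrees to the side, otherwise level:
half = math.radians(30) / 2
yaw, pitch, roll = controller_angles((math.cos(half), 0.0, math.sin(half), 0.0))
```

Emitting these as events would then just be a matter of sampling the controller transform each frame and firing when the decomposed values change.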
@thestonefox This would definitely be useful. Not sure if it would be doable, but a smoothing (interpolation) multiplier would be nice. I.e. if set to 0.0 there'd be no smoothing (which could be jittery when applied to objects); if set to 1.0 there'd be full interpolation to smooth it out. This would help in a lot of cases, especially things like your plane demo image, vehicle control, or anything where you're controlling something of significant size or need accuracy, like a combination lock on a bank vault. Just an idea. I just noticed in my project that the values were jittery. Cheers!
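The smoothing multiplier could be a simple exponential blend between the previous smoothed value and the raw reading (a Python stand-in; `smooth` is an illustrative name, not a toolkit API, and the sample readings are made up):

```python
def smooth(previous, raw, smoothing):
    """Blend the raw controller value toward the previous smoothed one.
    smoothing = 0.0 passes the raw (possibly jittery) value through;
    values nearer 1.0 damp the jitter at the cost of responsiveness."""
    smoothing = max(0.0, min(1.0, smoothing))
    return smoothing * previous + (1.0 - smoothing) * raw

# A jittery stream of pitch readings, smoothed frame by frame:
value = 10.0
for raw in [14.0, 9.0, 12.0, 11.0]:
    value = smooth(value, raw, 0.8)
```

Because each frame only moves a fraction of the way toward the raw reading, single-frame spikes get absorbed, which suits the combination-lock and vehicle-control cases mentioned above.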
@thestonefox Excellent idea. Would you hold the controller as you would a joystick? In fact, is it possible to get the controller to emulate a 5-button joystick?
@thestonefox I think this can be closed.
I still quite like this idea though
https://github.com/ExtendRealityLtd/Zinnia.Unity/issues/427