OSVR-Leap-Motion
Requirements for the gesture interface
Refer to the Gesture Interface Class Description and Summary documents. These seem to be in the discussion phase -- I don't think there is an API available for gestures yet.
Some notes from the docs:
A gesture can be considered an action that progresses over a period of time. During a gesture, a number of attributes make it up, such as joint type (hand, head, etc.), orientation, position, acceleration, and velocity. Together these attributes logically form a gesture entity that provides meaningful information about the pose of the user's body part(s). Gestures can be semantically separated into two types: discrete and continuous.
Leap Motion: watches for activity that matches a typical gesture. Within a frame it can detect a gesture (event object) and continue tracking it across multiple frames until it stops. It detects the following gestures: [see list...]
For our initial work, the gesture interface has a lower priority than the imaging, tracker, and config-related interfaces. Let's use this ticket to discuss details and decide how/when to proceed with gestures.
Gesture interface is now up and running
https://github.com/OSVR/OSVR-Specs-and-Proposals/tree/master/Interface%20Class%20Specifications/Gesture
No need to wait :)
Hi @yboger, I was looking for a "GestureInterface" class in the PluginKit (https://github.com/OSVR/OSVR-Core/tree/master/inc/osvr/PluginKit), but I didn't see one. What is the expected way for the plugin to send gesture events/information?
@zachkinstner my mistake. The code is ready and it is being checked in this morning. Stay tuned.
You might want to take a look at (and comment on, if you wish) this pull request https://github.com/OSVR/OSVR-Core/pull/181 which includes the gestures
@zachkinstner, you can check out the gesture branch before it's merged into master and build it. It has the code that you are looking for. This would also be a good way to review the pull request and comment on it.
A review/summary from the plugin perspective...
The C interface:
osvrDeviceGestureConfigure()
- OSVR_DeviceInitOptions opts
- OSVR_GestureDeviceInterface *iface
- OSVR_ChannelCount numSensors
- returns OSVR_ReturnCode
osvrDeviceGestureReportData()
- OSVR_GestureDeviceInterface iface
- const char *gestureName
- OSVR_GestureState gestureState
- OSVR_ChannelCount sensor
- OSVR_TimeValue const *timestamp
- returns OSVR_ReturnCode
The C++ device interface:
sendGestureData()
- OSVR_GestureState gestureState
- std::string const &gestureName
- OSVR_ChannelCount sensor
- OSVR_TimeValue const &timestamp
- returns void
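To make the calling pattern concrete, here is a self-contained sketch of how a plugin's update loop might report a multi-frame gesture through `sendGestureData()`. This is not code from the pull request: the OSVR types are minimal stand-ins (the real ones live in PluginKit), and `GestureDevice`/`reportCircleGesture` are hypothetical names for illustration.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Minimal stand-ins for the OSVR types named above (stubs, not the real API).
typedef int OSVR_ChannelCount;
enum OSVR_GestureState { OSVR_GESTURE_IN_PROCESS, OSVR_GESTURE_COMPLETE };
struct OSVR_TimeValue { long seconds; long microseconds; };

// Records each report so the flow can be inspected; the real interface
// would forward the data to connected OSVR clients instead.
class GestureDevice {
public:
    struct Report {
        OSVR_GestureState state;
        std::string name;
        OSVR_ChannelCount sensor;
    };
    void sendGestureData(OSVR_GestureState gestureState,
                         std::string const &gestureName,
                         OSVR_ChannelCount sensor,
                         OSVR_TimeValue const &timestamp) {
        (void)timestamp;
        reports.push_back(Report{gestureState, gestureName, sensor});
    }
    std::vector<Report> reports;
};

// A gesture spans several frames: IN_PROCESS while it continues,
// COMPLETE on the frame where it stops.
void reportCircleGesture(GestureDevice &dev, int framesInProgress) {
    OSVR_TimeValue now = {0, 0};
    for (int i = 0; i < framesInProgress; ++i)
        dev.sendGestureData(OSVR_GESTURE_IN_PROCESS, "Circle", 0, now);
    dev.sendGestureData(OSVR_GESTURE_COMPLETE, "Circle", 0, now);
}
```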
OSVR_GestureState can be either OSVR_GESTURE_IN_PROCESS or OSVR_GESTURE_COMPLETE.
Preset gesture strings:
#define OSVR_GESTURE_SWIPE_LEFT "SwipeLeft"
#define OSVR_GESTURE_SWIPE_RIGHT "SwipeRight"
#define OSVR_GESTURE_SCROLL_UP "ScrollUp"
#define OSVR_GESTURE_SCROLL_DOWN "ScrollDown"
#define OSVR_GESTURE_SINGLE_TAP "SingleTap"
#define OSVR_GESTURE_DOUBLE_TAP "DoubleTap"
#define OSVR_GESTURE_PINCH "Pinch"
#define OSVR_GESTURE_FINGER_SPREAD "FingerSpread"
#define OSVR_GESTURE_CIRCLE "Circle"
#define OSVR_GESTURE_LONG_PRESS "LongPress"
#define OSVR_GESTURE_OPEN_HAND "OpenHand"
#define OSVR_GESTURE_CLOSED_HAND "ClosedHand"
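One open question for the Leap plugin is how its four gesture types map onto these preset names. The sketch below is an assumption, not part of the proposal: `LeapGestureType` is a hypothetical enum, the choice to fold both taps into `SingleTap` is a guess, and so is the +x-means-right convention used to split swipes into `SwipeLeft`/`SwipeRight`.

```cpp
#include <cassert>
#include <string>

// Preset names from above (subset).
#define OSVR_GESTURE_SWIPE_LEFT "SwipeLeft"
#define OSVR_GESTURE_SWIPE_RIGHT "SwipeRight"
#define OSVR_GESTURE_CIRCLE "Circle"
#define OSVR_GESTURE_SINGLE_TAP "SingleTap"

// Hypothetical enumeration of the four Leap Motion gesture types.
enum LeapGestureType { LeapCircle, LeapSwipe, LeapKeyTap, LeapScreenTap };

// Chooses a preset OSVR gesture name for a Leap gesture. For swipes, the
// sign of the direction's x component picks left vs. right (an assumed
// convention: +x points to the user's right).
std::string presetNameFor(LeapGestureType type, float directionX) {
    switch (type) {
    case LeapCircle:
        return OSVR_GESTURE_CIRCLE;
    case LeapSwipe:
        return directionX < 0 ? OSVR_GESTURE_SWIPE_LEFT
                              : OSVR_GESTURE_SWIPE_RIGHT;
    case LeapKeyTap:
    case LeapScreenTap:
        return OSVR_GESTURE_SINGLE_TAP;  // both taps mapped to SingleTap here
    }
    return "";
}
```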
So usage might look like this:
mGestureDevice.sendGestureData(OSVR_GESTURE_IN_PROCESS, OSVR_GESTURE_CIRCLE, 0, time);
Should there be a way to provide numeric data about the gesture?
- For gestures that are "in progress", it could be useful to show how far it is toward completion.
- For something like swipe or scroll, the speed of the gesture might be important.
- For something like a pinch, the distance/amount of pinch might be important.
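The proposed interface carries only a name and a state, so numeric data like this would have to travel some other way (for example, on companion analog channels). The struct below is purely hypothetical, just to pin down what "numeric data about the gesture" might contain; `normalizedProgress` clamps a raw progress value, since some trackers report progress above 1.0 for repeated motion (a Leap circle's progress counts full turns, for example).

```cpp
#include <cassert>

// Hypothetical companion data for a gesture report; not part of the
// proposed OSVR gesture interface.
struct GestureMetrics {
    double progress;  // 0.0 (just started) .. 1.0 (complete)
    double speed;     // e.g. mm/s for a swipe or scroll
    double amount;    // e.g. pinch distance in mm
};

// Clamps raw tracker progress into [0, 1] for "how far toward completion".
double normalizedProgress(double rawProgress) {
    if (rawProgress < 0.0) return 0.0;
    if (rawProgress > 1.0) return 1.0;
    return rawProgress;
}
```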
The commit referenced above includes a new Gestures class. For now, this class simply outputs the gesture information to the console. Once the OSVR gesture interface is ready, the plugin can connect to it.
Regarding the numeric data that I mentioned in my previous comment, I'll summarize the data that seems most important for the Leap Motion gestures.
All Leap Motion gestures have:
- Type: Circle, Swipe, KeyTap, or ScreenTap
- State: Start, Update, or Stop
- Duration: time elapsed during the gesture motion
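Note the state models don't line up: Leap reports three states while the proposed OSVR interface has only two. A plausible mapping (an assumption, shown with local stand-in enums rather than the real headers) folds Start and Update into `OSVR_GESTURE_IN_PROCESS`:

```cpp
#include <cassert>

// Local stand-ins: Leap's three gesture states vs. OSVR's two.
enum LeapState { LeapStart, LeapUpdate, LeapStop };
enum OSVR_GestureState { OSVR_GESTURE_IN_PROCESS, OSVR_GESTURE_COMPLETE };

// Start and Update both mean "still going"; only Stop completes the gesture.
OSVR_GestureState toOsvrState(LeapState s) {
    return s == LeapStop ? OSVR_GESTURE_COMPLETE : OSVR_GESTURE_IN_PROCESS;
}
```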
The other information can be factored into five values:
| Value | Type | Circle | Swipe | KeyTap | ScreenTap |
|---|---|---|---|---|---|
| Position | vector | :ballot_box_with_check: center | :ballot_box_with_check: startPosition | :ballot_box_with_check: | :ballot_box_with_check: |
| Direction | vector | :ballot_box_with_check: | :ballot_box_with_check: | :ballot_box_with_check: | |
| Progress | float | :ballot_box_with_check: | :ballot_box_with_check: | :ballot_box_with_check: | |
| Speed | float | | :ballot_box_with_check: | | |
| Radius | float | :ballot_box_with_check: | | | |
@zachkinstner the gesture data table is exactly right. And, without this data it seems very odd to provide gestures at all. That being said, there has been an internal discussion about separating support of gestures from the main Leap service. So, I would be happy to have an integration that does not include gestural support.
> without this data it seems very odd to provide gestures at all.
Agreed. The position and direction information, in particular, seems vital to provide. Looking at the table, we could potentially combine direction and speed into one "velocity" value.
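The suggested combination is just scaling the unit direction vector by the scalar speed; normalizing the result recovers both original values, so nothing is lost. A minimal sketch, with `Vec3` as a local stand-in type:

```cpp
#include <cassert>
#include <cmath>

// Local stand-in vector type for illustration.
struct Vec3 { double x, y, z; };

// Combines a unit direction and a scalar speed into one velocity value.
Vec3 velocityFrom(Vec3 const &unitDirection, double speed) {
    return Vec3{unitDirection.x * speed, unitDirection.y * speed,
                unitDirection.z * speed};
}

// The magnitude of the velocity gives back the original speed.
double speedFrom(Vec3 const &velocity) {
    return std::sqrt(velocity.x * velocity.x + velocity.y * velocity.y +
                     velocity.z * velocity.z);
}
```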
> I would be happy to have an integration that does not include gestural support.
I created new milestone tags, and gave this issue the "Later Release" milestone.