Pen/Tablet Input Support
I propose we support pen/tablet input on devices such as the Surface Pro/Studio, the Galaxy Note, and eventually the Apple Pencil. These devices support features like pressure sensitivity and rotation/gyro. I don't think this is necessarily relevant to games (although it certainly can be), but it's definitely relevant to the larger digital 3D community.
There is an issue open on imgui (https://github.com/ocornut/imgui/issues/2372) that has a little documentation and discussion on how to handle pen input across platforms that might be relevant when work on this starts.
I made a preliminary implementation for pen support on X11 here https://github.com/DorianRudolph/winit/tree/stylus
Pen pressure can already be accessed on X11 via the raw axis events, but there you get no information about the range of values.
@DorianRudolph What is the state of your implementation? Any help needed? I could look into stylus support for Wayland but first we should probably make sure the API exposed in Winit is correct.
@lehmanju if you're interested in contributing Wayland bits, you could add helpers for tablet input to https://github.com/smithay/client-toolkit , since that's something that will be required anyway.
@lehmanju I'm not working on this right now. I would consider it a proof of concept. The implementation works, but I did not give much thought to the API. It might make sense to implement https://github.com/rust-windowing/winit/issues/336 prior to tablet support.
A bit confused, but how would you consider it not implemented? It looks like it effectively is, through Touch.
Pen input devices can have additional values, such as pen pressure, tilt, and azimuth that are not currently supported in winit. Additionally (and anecdotally), applications sometimes want to be able to distinguish between touch and pen input, even if most interactions end up being the same.
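To make the distinction concrete, a hypothetical pen payload could carry these extra axes alongside the usual position. This is a sketch only; none of these type or field names exist in winit today:

```rust
/// Hypothetical sketch of the extra axes a pen/stylus can report.
/// These names are illustrative and do not exist in winit.
#[derive(Debug, Clone, Copy, PartialEq)]
pub struct PenAxes {
    /// Normalized contact pressure in 0.0..=1.0.
    pub pressure: f64,
    /// Tilt away from the surface normal, in degrees (0 = perpendicular).
    pub tilt: f64,
    /// Azimuth: the compass direction the pen barrel points in, in degrees.
    pub azimuth: f64,
}

impl PenAxes {
    /// Clamp pressure into the normalized range, since raw platform
    /// values can occasionally stray outside it.
    pub fn with_clamped_pressure(mut self) -> Self {
        self.pressure = self.pressure.clamp(0.0, 1.0);
        self
    }
}
```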
Touch has the force field, which seems to cover pressure sensitivity and tilt. I could see the case for the latter, and it only works on iOS and Windows 8+ right now, but wouldn't you still consider stylus input mostly supported?
That's interesting... Looks like it's only used for the Apple Pencil: https://github.com/rust-windowing/winit/blob/b9f3d333e41464457f6e42640793bf88b9563727/src/platform_impl/ios/view.rs#L233
I think we'd still need to hook up azimuth to do that cool tilt shading effect Apple showed off when they released the Pencil...
Although even more broadly than that, I still think it's probably worth it to separate it out into a more generalized Pen event and implement it for all the platforms.
Edit: this may be a breaking change. At least on Windows, if one explicitly asks for pen input, the OS will stop sending it as touch input.
It looks like Windows does have pressure sensitivity support: https://github.com/rust-windowing/winit/blob/66859607a35f7e46a25346511657d3b7ada939ca/src/platform_impl/windows/event_loop.rs#L1482-L1496
Rotation/tilt should be trivial to add as well.
I'm confused why normalize_pointer_pressure() expects to only get a maximum of 1024, though this issue seems to imply that the Win32 Pointer API is limited to that number compared to WinRT. It might be worth investigating that in the future?
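For reference, the normalization in question boils down to dividing by that maximum. A standalone sketch, where the 1024 constant is an assumption taken from the linked winit code rather than a documented guarantee for every device:

```rust
/// Maximum raw pressure value the Win32 Pointer API appears to report.
/// (Assumed from winit's normalize_pointer_pressure; WinRT may exceed it.)
const WIN32_MAX_PRESSURE: u32 = 1024;

/// Map a raw Win32 pen pressure reading into 0.0..=1.0, clamping
/// anything above the assumed maximum.
fn normalize_pressure(raw: u32) -> f64 {
    raw.min(WIN32_MAX_PRESSURE) as f64 / WIN32_MAX_PRESSURE as f64
}
```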
I'll admit the last time I seriously used winit was pre-0.20, and pen support wasn't sufficient for any non-trivial pen-based apps (for reference on what's missing, here's a proof of concept supporting pen on Windows and web: https://github.com/rust-windowing/winit/compare/2e11615...tangmi:winit-legacy-2). I don't believe that any significant support for pen/tablet input has been added since then, so I think it's meaningful to keep this issue open.
Agreed, it doesn't seem very mature yet.
Looking through @DorianRudolph's PR, it looks like pressure is normalized, so could we use the Touch event in a similar way to how it's done on Windows (instead of adding another event struct)?
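For context, winit's existing Force type already collapses platform pressure readings into a normalized value in roughly this way. The following is a self-contained re-sketch of that idea for illustration, not winit's actual code (the real type lives in winit::event and may differ in detail):

```rust
/// Local mirror of the idea behind winit's Force type (sketch only).
enum Force {
    /// Device-calibrated force with a known maximum, plus the pen's
    /// altitude angle in radians when the platform reports one.
    Calibrated {
        force: f64,
        max_possible_force: f64,
        altitude_angle: Option<f64>,
    },
    /// Already normalized into 0.0..=1.0.
    Normalized(f64),
}

impl Force {
    /// Collapse either representation into a 0.0..=1.0 value.
    fn normalized(&self) -> f64 {
        match *self {
            Force::Calibrated { force, max_possible_force, altitude_angle } => {
                // Compensate for tilt: a tilted pen transfers less of
                // its force perpendicular to the screen.
                let force = match altitude_angle {
                    Some(angle) => force / angle.sin(),
                    None => force,
                };
                force / max_possible_force
            }
            Force::Normalized(force) => force,
        }
    }
}
```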
Just want to register my interest in seeing pen input support in place.
For reference, pointer prediction is also a thing:
- Web: https://www.w3.org/TR/pointerevents3/#predicted-events
- Windows: https://learn.microsoft.com/en-us/windows/windows-app-sdk/api/winrt/microsoft.ui.input.pointerpredictor.getpredictedpoints
- iOS: https://developer.apple.com/documentation/uikit/uievent/1613814-predictedtouchesfortouch
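As a rough illustration of what prediction means, here is a naive linear extrapolation from the last two samples. This is a sketch with made-up types; the real platform predictors linked above use far more sophisticated models:

```rust
/// A timestamped pointer sample (position in pixels, time in seconds).
#[derive(Debug, Clone, Copy, PartialEq)]
struct Sample {
    t: f64,
    x: f64,
    y: f64,
}

/// Naively predict where the pointer will be at time `t` by linearly
/// extrapolating the velocity between the last two samples.
fn predict(a: Sample, b: Sample, t: f64) -> Sample {
    let dt = b.t - a.t;
    let (vx, vy) = ((b.x - a.x) / dt, (b.y - a.y) / dt);
    Sample {
        t,
        x: b.x + vx * (t - b.t),
        y: b.y + vy * (t - b.t),
    }
}
```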
Can we have this in the foreseeable future? May I help?
It seems like the pieces are there for all platforms but Wayland. So depending on your needs, you can already use a tablet with either mine or Atilogit's PR. Someone (maybe you) just needs to prepare a production-ready PR, I think. Currently both PRs implement pen input on top of the touch event, which is not ideal, so we should create separate events for pen input.
Can we have this in the foreseeable future? May I help?
I'm not sure how near that future would be, but we lack the general API for this, and the API for tablets is really complicated if you've read into what the platforms expose.
I think I have a rough understanding of how the wayland API works, and I'll try implementing this for wayland soon. It seems like how to report the events to the user hasn't been decided yet, so should I just use what's convenient? That way people can worry about the specifics after most platforms are implemented.
The main issue with all of those devices (mouse, touch, pen) is that they are not unified in winit, which is a bit annoying. If you have an idea and you want to work on that, you could probably try to propose a top-level API and an example first, without implementation in the backend.
There are also patches adding pen support that you could take a look at: #2647, #2396, and #1879. They should give you some brief information about what the other backends have.
There's also a tablet pad (or whatever it's called) to consider, because it has events of its own...
Being able to handle pen double-taps on my iPad would be amazing!
Thanks for the interest in working on this in https://github.com/rust-windowing/winit/issues/3759 @ActuallyHappening! I think the hardest part about this is not so much the implementation, but more figuring out what the generated WindowEvents should look like - maybe you (or someone else) could start by suggesting an API here, and then we can work from there?
I work primarily with the bevy wrappers around winit. From that point of view, the only event I would really care about looks like this (allowing for forwards compatibility):
#[non_exhaustive]
#[derive(Debug)]
pub enum PenEvent {
    DoubleTap,
    // Squeeze // todo
}
Thinking about this more, I believe only Apple pens have the specific 'Double Tap' feature. Of course there would be documentation to note this. Keeping the enum descriptive (rather than pub struct PenEvent { is_tap: bool }) seems a better design, but my only requirement (for my specific use case) is the ability to receive an event when I double-tap my Apple Pencil. It would also be good to expose some specific information about what the requested action is; see preferredTapAction: https://developer.apple.com/documentation/uikit/uipencilinteraction/3039593-preferredtapaction
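For completeness, the preferred-action information could look something like this. The names are hypothetical, loosely mirroring some of the cases of UIKit's UIPencilPreferredAction:

```rust
/// Hypothetical mapping of the user's configured double-tap action,
/// loosely mirroring UIKit's UIPencilPreferredAction (sketch only;
/// not an existing winit type).
#[non_exhaustive]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum PenPreferredTapAction {
    /// The user disabled the double-tap gesture.
    Ignore,
    /// Toggle between the current tool and the eraser.
    SwitchEraser,
    /// Switch back to the previously used tool.
    SwitchPrevious,
    /// Show the color palette.
    ShowColorPalette,
}
```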
Resolution from the meeting today: We want to start with something matching the web API, specifically pointerType, twist and tilt[X|Y]. @daxpedda wants to try to do this implementation.
On top of that, we could probably add some extra methods to know if an event was e.g. a double tap or a special button press, but we want to start with the above.
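A minimal sketch of what "matching the web API" could mean for the event payload. The Rust names here are assumptions; the value ranges are taken from the Pointer Events spec:

```rust
/// Sketch of pointer data following the Pointer Events spec
/// (hypothetical winit types; ranges per the W3C spec).
#[derive(Debug, Clone, Copy, PartialEq)]
pub struct PointerDetails {
    pub pointer_type: PointerType,
    /// Clockwise rotation of the pen around its own axis, 0..=359 degrees.
    pub twist: u16,
    /// Tilt of the pen in the X plane, -90..=90 degrees.
    pub tilt_x: i8,
    /// Tilt of the pen in the Y plane, -90..=90 degrees.
    pub tilt_y: i8,
}

/// Counterpart of the web's string-valued pointerType.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum PointerType {
    Mouse,
    Pen,
    Touch,
    Unknown,
}
```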
OK interesting, though in the link to the web API you mentioned there is no functionality specific to Apple Pencil / Galaxy S Pen buttons. I.e., after implementing the entire web API, I would still not be able to handle Apple Pencil double tap (or squeeze) events like I want to (see the bevy issue and #3759).
Would this API be mutually compatible with the decided web API?
/// Pen events not necessarily tied to a normal touch, move, drag, etc.
pub enum PenSpecialEvent {
    DoubleTap { preferred_action: Option<PenPreferredTapAction> },
    // Squeeze // todo
}
I guess I should explain that the Apple Pencil double tap feature often occurs when the pen is not touching the screen, so information like screen position doesn't make sense for it.
Maybe it would help to know what kind of events the pen generates when used in a web page?
Ahh I see, trying to mimic the Web API for native.
I initially tried to get pen events on the web, but special Apple Pencil events (i.e. DoubleTap) don't trigger anything there, which was one of the initial factors pushing me to use Rust natively in the first place, so that I could use the features of my Apple Pencil. I haven't checked recently whether this is still the case, however.
Looking over the web API, we could merge it with the features I want. From the pointerType property:
The event's pointer type. The supported values are the following strings:
- "mouse": The event was generated by a mouse device.
- "pen": The event was generated by a pen or stylus device.
- "touch": The event was generated by a touch, such as a finger.
If the device type cannot be detected by the browser, the value can be an empty string (""). If the browser supports pointer device types other than those listed above, the value should be vendor-prefixed to avoid conflicting names for different types of devices.
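That string-valued field maps naturally onto an enum on the Rust side. A sketch, where the empty-string and vendor-prefixed cases fall into a catch-all:

```rust
/// Rust-side counterpart of the web's pointerType string (sketch only).
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum PointerType {
    Mouse,
    Pen,
    Touch,
    Unknown,
}

/// Map a pointerType string onto the enum, treating "" and any
/// vendor-prefixed value as Unknown.
fn parse_pointer_type(s: &str) -> PointerType {
    match s {
        "mouse" => PointerType::Mouse,
        "pen" => PointerType::Pen,
        "touch" => PointerType::Touch,
        _ => PointerType::Unknown,
    }
}
```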
I understand more what you mean by extending the official spec now:
On top of that, we could probably add some extra methods to know if an event was e.g. a double tap or a special button press, but we want to start with the above.
This does seem to be a more consistent and unified API.
The only problem I see with trying to add Apple Pencil double tap event support is that the event itself is not associated with any of the touch event types in the web API spec, here. The spec also requires that each pointer event have a specific .clientX, .clientY, etc. from the MouseEvent interface; however, Apple Pencil double tap events don't have an associated position on the screen, because you are often holding the pencil off the screen when you double tap.
I'm still eager to implement this, see #3768
The only problem I see with trying to add Apple Pencil double tap event support is that the event itself is not associated with any of the touch event types in the web API spec, here.
This is to be expected, there will be a bunch of special events that just don't fit anywhere and maybe are platform-specific with no overlap.
I think to start the design work of post-Web-API-pencil-input we want a proper overview.
- A list of relevant events for each backend.
- Find overlap between the backend.
- Consider which ones to implement or if we want/can let external crates to implement those.
Documentation:
- Android: NDK input
- iOS/MacOS: PencilKit?
- Wayland: libinput (API, documentation), Tablet protocol
- Windows: GetPointerPenInfo
- X11: x.org is down, couldn't search for it.
Thinking about my future considerations: I want to custom-build a lot of functionality relevant only to iOS.
For that I should really only need a reference to the WinitUIView and WinitUIViewController and the ability to implement protocols on them, but I believe objc2 can only implement Objective-C protocols in the defining crate.
I'm out of my depth here, but is it possible for winit to expose the ability to add custom views and view controllers to the base/default without messing up any of winit's abstractions?
That seems the cleanest and best option to me, since I could then maintain my own crates implementing iOS-only features rather than peppering winit with PRs for every new iOS-specific change.
I think to start the design work of post-Web-API-pencil-input we want a proper overview.
- A list of relevant events for each backend.
- Find overlap between the backend.
- Consider which ones to implement or if we want/can let external crates to implement those.
I can only speak confidently about iOS since it's the only hardware I have. I'm happy to make a list of iOS features I would eventually want:
- Double Tap events + Hover Pose data
- Squeeze events
- Preferred action for tap and squeeze events
- Pen prediction