add support for touch events (WIP)
Opening this initial PR for feedback, not as a request for merging.
There are a number of TODO items:
- hardcoded scaling for mapping the touch co-ordinate system into layout co-ordinates (works on my Pinephone :-).
- we should send events to the client no matter what the Lua handler returns (I think the code that suppresses them is broken anyway...). Instead, if the handler returns true, we should send the client a touch cancel event.
- maybe there's a better way to deal with seat capabilities than adding a "touchpads" field to kiwmi_input
To discuss: unlike the Sway touch implementation, we don't emulate pointer events in the server for non-touch-enabled clients. Emulation adds a bit of complexity, and I suspect it's unnecessary for "mainstream" client toolkits, which do it client-side. When I test this with Gtk apps, for example, if they don't request touch events they still get button events from touches.
> hardcoded scaling for mapping the touch co-ordinate system into layout co-ordinates (works on my Pinephone :-).
~~If I’ve grasped the random comments that have flown around me correctly, wlr_input_device.output_name should be used to identify the output. But don’t take my word on that.~~
Ugh, apparently there’s wlr_cursor_absolute_to_layout_coords(), does that work?
> To discuss: unlike the Sway touch implementation, we don't emulate pointer events in the server for non-touch-enabled clients. It adds a bit of complexity and I suspect it's unnecessary for "mainstream" client toolkits which do that client side. When I test this with Gtk apps, for example, if they don't request touch events then they get button events from touches.
What about programs like foot?
> Ugh, apparently there’s wlr_cursor_absolute_to_layout_coords(), does that work?
I'll take a look, thanks. There are a couple of different use cases here.
- your smartphone touch device should map directly onto the output for its internal LCD (probably called DSI-1 or something) even if you plug an external display into its USB-C
- you plug a wacom tablet into your dual-monitor desktop, maybe you want to be able to reach any point on either screen
I think Sway has magic (aka heuristics) to deal with this; I don't know what other compositors do. It may be that we should expose this to Lua so the user can set policy.
> To discuss: unlike the Sway touch implementation, we don't emulate pointer events in the server for non-touch-enabled clients. It adds a bit of complexity and I suspect it's unnecessary for "mainstream" client toolkits which do that client side. When I test this with Gtk apps, for example, if they don't request touch events then they get button events from touches.
>
> What about programs like foot?
Good question. Not currently, it seems - https://codeberg.org/dnkl/foot/issues/517
> What about programs like foot?
>
> Good question. Not currently, it seems - https://codeberg.org/dnkl/foot/issues/517
Well, it was more like ‘there still are popular programs facing Wayland directly, which should also work to some extent’ 😉. On the other hand, it also depends on which ‘missing implementation’ you consider missing and which you consider ‘not in scope’ instead.
Yes, but in the specific case of foot I use it myself - albeit not yet on a touchscreen - and thought I'd better check :-) My assumption (entirely untested) is that such clients won't work on touch-only devices in Gnome Shell or KDE either, so while it would be nice to have pointer emulation in the compositor, there's probably some incentive for client authors to add touch support themselves anyway.
I know that the Wayland "design philosophy" changes from year to year, but this blog post from 2017 makes a good argument, to me: https://blog.martin-graesslin.com/blog/2017/02/how-input-works-touch-input/