Touchscreen support
Return touchscreen support for Gnome
I am on GNOME 47 Wayland as well. No buttons are accessible through touch. Any solutions?
Found a workaround: use the flag --vo=x11, or put vo=x11 in mpv.conf.
There are still some issues with it, though.
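For reference, the workaround above as an mpv.conf fragment. Note that x11 is a legacy, unaccelerated video output (it runs via XWayland on Wayland sessions), so expect higher CPU usage; this sidesteps the touch issue rather than fixing it:

```
# mpv.conf
# Workaround: force the X11 video output. On a Wayland session this goes
# through XWayland, which is presumably why touch behaves differently.
vo=x11
```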
We don't have any platform-specific touch code in uosc, so if it doesn't work somewhere, it's probably an mpv or desktop environment issue.
The old version, 5.1.1, worked.
I don't have a gnome touch device so can't do much, but if you manage to pinpoint a specific version or ideally a commit that broke it, that might be helpful.
On KDE Plasma 6.3, buttons respond to touch only when the mouse is hovering somewhere over the mpv window. Tapping anywhere to play/pause DOES work at any time; it is just the buttons that don't. I also use another script, mpv-touch-gestures, which allows touch input at any time.
You can try and investigate yourselves. Our whole cursor handling is in a single file: cursor.lua
The binding and handling of mpv events starts at the bottom: https://github.com/tomasklaen/uosc/blob/a52305485029970247ced840952f47e62715a71f/src/uosc/lib/cursor.lua#L401
From glancing at it I do see stuff like:
if not mouse then return end
But that was added by @christoph-heinrich (who also made the above linked mpv-touch-gestures) to fix some touch/cursor issues, so I guess there's a reason behind that line. I can't recall specifics as it never affected me, and I'm afraid to just remove stuff like this without understanding all the (side)effects of it as it might reintroduce issues it was put there to fix.
I'm not sure what it all does exactly, but I did go in and try commenting out different parts of the --movement section; none of that fixed the issue (and some of it prevented the mouse from working too).
One more piece of info: when the buttons do respond to touch on KDE Plasma (with the mouse somewhere in the window), you have to tap buttons above the timeline twice to get them to activate. Buttons inside pop-up menus work normally, although they also require the mouse to be over the window. You don't need to tap the exact control twice: you can tap nearby and then tap the control, and if the mouse is already nearby, the control works with one tap.
The way I am accessing those controls via touch is with the setting controls_persistency=paused, otherwise if the mouse wasn't in the window I wouldn't be able to see them. My intuition is that the controls still require proximity to work even though they are displayed? The mouse can be anywhere in the window for the controls to register touch, but this would also explain the need to tap the controls or somewhere nearby twice – the first tap activates proximity.
Try to replace uosc/lib/cursor.lua with this: cursor.lua.zip
In case it works, test that it doesn't break window dragging, speed/volume dragging, menu clicks and scrolling (when navigating directories/playlists), etc. We'd also need to check compatibility with other GUI scripts that rely on mbtn events, but I can't think of one.
If the file above works, the only remaining issue I see now is that you still need to tap elements like speed/volume before you can start dragging them. I don't think I can do anything about that without mpv introducing a completely new event handling API that'd allow us to decide whether window dragging should be prevented inside the event handler instead of before the event happens as it is currently.
From glancing at it I do see stuff like:
if not mouse then return end
That's just a nil check, because a user reported a crash at some point. It's a callback for observe_property().
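For context, the pattern being discussed looks roughly like this. This is a simplified sketch, not uosc's actual code; handle_cursor_move is a hypothetical handler standing in for uosc's cursor logic:

```lua
-- Simplified sketch of the nil-check pattern in an observe_property callback.
mp.observe_property('mouse-pos', 'native', function(_, mouse)
    -- The property value can arrive as nil (e.g. before the window exists),
    -- so bail out early instead of indexing a nil table and crashing.
    if not mouse then return end
    handle_cursor_move(mouse.x, mouse.y) -- hypothetical handler
end)
```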
I currently don't have a touch device (other than my phone), so I can't test this.
So I've spent more time on this today than I'd like, and to quote the commit above:
Seems like down events fire BEFORE mouse-pos and touch-pos are triggered, which means our down event handlers can't tell where the cursor actually is to properly triage the event.
The only fix I see is moving everything to up events, which is not an option since people always complain about unresponsive UI when we do that, so I'm giving up on this for now.
Still, this has some code cleanups and useful changes, so it's at least a refactor instead of a fix. And touch is somewhat usable now (if you tap enough times on something, you get a click!). ref
Today I've tried implementing a touch support option that would move all events handlers to up events, but tons of stuff was still broken. I just can't do anything without mpv telling me where the cursor is on down event.
So unless someone shares some API knowledge that unlocks this data for me or mpv corrects the order in which events are fired, touch support will remain broken.
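A minimal standalone diagnostic script (not part of uosc) could confirm the event ordering described above. It assumes mpv's Lua scripting API (mp.observe_property, complex key bindings); drop it into ~~/scripts/ and watch the terminal while tapping the window:

```lua
-- Logs the order in which mpv delivers MBTN_LEFT down events vs.
-- mouse-pos/touch-pos property updates, with timestamps.
local msg = require('mp.msg')
local utils = require('mp.utils')

mp.observe_property('mouse-pos', 'native', function(_, pos)
    msg.info(('%.4f mouse-pos %s'):format(mp.get_time(), utils.to_string(pos or {})))
end)

mp.observe_property('touch-pos', 'native', function(_, touches)
    msg.info(('%.4f touch-pos %s'):format(mp.get_time(), utils.to_string(touches or {})))
end)

-- A forced complex binding so down/up events are visible while testing.
-- Note: this swallows left clicks for other scripts while it's loaded.
mp.add_forced_key_binding('MBTN_LEFT', 'touch-order-log', function(e)
    msg.info(('%.4f MBTN_LEFT %s'):format(mp.get_time(), e.event))
end, {complex = true})
```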
Have you tried using the touch-pos property?
I haven't done anything with it so far (no working touch device), so no idea how well it works, but whenever the user touches somewhere, it should show up as a property update.
Yeah, the commit above implements touch-pos, but as explained above, it's broken because mpv triggers DOWN events BEFORE it triggers TOUCH-POS, so there's no way for me to know where the cursor is at the time of the DOWN event to properly triage it. I even tried to get the touch-pos property via mp.get_property_native during the down event, and even there it was stale.
What I meant was to only listen to touch-pos. So when a touch-pos update happens, that IS the down event for touch input.
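A hedged sketch of that idea: treat the first touch-pos update as the "down" event for touch input, rather than waiting for MBTN_LEFT. This assumes touch-pos is a native list of active touch points (tables with x/y fields), as recent mpv versions expose it; on_touch_down is a hypothetical handler:

```lua
-- Drive touch "down" detection purely from touch-pos property updates.
local touch_active = false

mp.observe_property('touch-pos', 'native', function(_, touches)
    if touches and #touches > 0 then
        if not touch_active then
            touch_active = true
            on_touch_down(touches[1].x, touches[1].y) -- hypothetical handler
        end
    else
        -- List went empty: all fingers lifted, re-arm for the next touch.
        touch_active = false
    end
end)
```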
Now the question is how to know that the mouse down event is actually from touch, so that it can be ignored... Maybe a small timeout? But that doesn't seem like a robust solution.
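The timeout idea could be sketched like this, with its acknowledged fragility: remember when the last touch-pos update arrived, and treat a mouse down that follows within a short grace window as synthesized from that touch. The threshold and function names here are illustrative, not a tested solution:

```lua
-- Heuristic: suppress mouse down events that arrive right after touch input.
local TOUCH_GRACE = 0.1 -- seconds; arbitrary, would need tuning per platform
local last_touch_time = -math.huge

mp.observe_property('touch-pos', 'native', function(_, touches)
    if touches and #touches > 0 then
        last_touch_time = mp.get_time()
    end
end)

-- Would be called from the MBTN_LEFT down handler to decide whether to
-- ignore the event as a duplicate of the touch that was already handled.
local function came_from_touch()
    return mp.get_time() - last_touch_time < TOUCH_GRACE
end
```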