[X11] Scroll using touch screen isn't working
Current behavior
Any scrollable section (ListView, ScrollViewer) can't be scrolled on Skia/X11 (Desktop) using touch-screen gestures. This worked fine in 5.1 and lower. Tested with SDK 5.3.96 and 5.4.0-dev.180; the issue is the same in both.
Expected behavior
Dragging a scrollable section with a finger should scroll the content.
How to reproduce it (as minimally and precisely as possible)
Create a new Uno App and, in MainPage.xaml, replace the TextBox and Button with this content:
<TextBox Text="{Binding Name, Mode=TwoWay}"
         PlaceholderText="Enter your name:" />
<ScrollViewer HorizontalScrollMode="Disabled" VerticalScrollBarVisibility="Visible" Height="100">
    <StackPanel Orientation="Vertical">
        <Button Margin="8">Hej 1</Button>
        <Button Margin="8">Hej 2</Button>
        <Button Margin="8">Hej 3</Button>
        <Button Margin="8">Hej 4</Button>
        <Button Margin="8">Hej 5</Button>
    </StackPanel>
</ScrollViewer>
<Button Content="Go to Second Page"
        AutomationProperties.AutomationId="SecondPageButton"
        Command="{Binding GoToSecond}" />
Then run on Desktop/Skia and try to scroll the section using the touch screen. Note that I have only tested this on a Raspberry Pi with a touch screen; I don't have touch on my Windows laptop.
Workaround
No response
Works on UWP/WinUI
None
Environment
Uno.UI / Uno.UI.WebAssembly / Uno.UI.Skia
NuGet package version(s)
SDK 5.3.96, SDK 5.4.0-dev.180
Affected platforms
Skia (Linux X11)
IDE
Visual Studio 2022
IDE version
No response
Relevant plugins
No response
Anything else we need to know?
No response
For reference, I can scroll by dragging the scrollbar on the side, but I used to be able to scroll by dragging anywhere in the scrollable region.
This is due to the current lack of proper support for touch events: on X11, we currently interpret touch events as mouse inputs, which matches your findings. Dragging the scrollbar works because that also works with a mouse, but dragging a scrollable region with a mouse doesn't scroll it, so that gesture currently doesn't work on X11. Usually, the difference between marking inputs as mouse or touch is not significant, but this is an edge case where it does matter.
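In the meantime, here is a minimal sketch of a possible stopgap, assuming the ScrollViewer from the repro above is given x:Name="Scroller" on the standard MainPage (both names are illustrative, not part of the issue). It pans the ScrollViewer manually from raw pointer events via ChangeView, so dragging works even while the touch contact is reported as mouse input.

using Microsoft.UI.Xaml.Controls;
using Microsoft.UI.Xaml.Input;
using Windows.Foundation;

public sealed partial class MainPage : Page
{
    private Point _lastPosition;
    private bool _isDragging;

    public MainPage()
    {
        this.InitializeComponent();

        // AddHandler with handledEventsToo: true so we still see pointer events that
        // the Buttons inside the ScrollViewer mark as handled.
        Scroller.AddHandler(PointerPressedEvent, new PointerEventHandler(OnScrollerPointerPressed), true);
        Scroller.AddHandler(PointerMovedEvent, new PointerEventHandler(OnScrollerPointerMoved), true);
        Scroller.AddHandler(PointerReleasedEvent, new PointerEventHandler(OnScrollerPointerReleased), true);
    }

    private void OnScrollerPointerPressed(object sender, PointerRoutedEventArgs e)
    {
        // On X11 this currently reports Mouse even for a finger contact, per the comment above.
        System.Diagnostics.Debug.WriteLine($"Pointer type: {e.Pointer.PointerDeviceType}");

        _isDragging = true;
        _lastPosition = e.GetCurrentPoint(Scroller).Position;
        Scroller.CapturePointer(e.Pointer);
    }

    private void OnScrollerPointerMoved(object sender, PointerRoutedEventArgs e)
    {
        if (!_isDragging)
        {
            return;
        }

        var current = e.GetCurrentPoint(Scroller).Position;
        var deltaY = _lastPosition.Y - current.Y;
        _lastPosition = current;

        // ChangeView scrolls to an absolute offset; disabling the animation keeps the drag 1:1.
        Scroller.ChangeView(null, Scroller.VerticalOffset + deltaY, null, disableAnimation: true);
    }

    private void OnScrollerPointerReleased(object sender, PointerRoutedEventArgs e)
    {
        _isDragging = false;
        Scroller.ReleasePointerCapture(e.Pointer);
    }
}

This is deliberately simplistic: there is no inertia, and capturing the pointer on press will swallow taps on the Buttons inside the ScrollViewer. A fuller version would only start panning (and capturing) after the pointer has moved past a small threshold.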
@ramezgerges Thanks for the explanation! Just to help my understanding, can you explain why this changed when we went from the (now) legacy GTK support to Desktop support? I was under the impression that Uno talked directly to X11 before as well, but I'm probably wrong about that. Did GTK handle touch events before? I'm just curious :) Obviously it would be great to get this issue resolved; it's currently a showstopper for me to move to Desktop (vs. legacy GTK). I noticed there are some merge conflicts in #15799; is that all that's missing for it to be considered for merge?
@HakanL Previously we built on top of GTK, which gave us a layer of abstraction so we didn't have to handle touch ourselves. That is no longer the case, so we now need to implement it properly.
I see, so GTK is no longer a dependency? Is there a roadmap to support Wayland directly as well?
@HakanL we have #17600 to track Wayland support