maui
GraphicsView Interactions and Gestures stop working when used together on Android
Description
On Windows the following Interactions and Gestures work as expected.
<ContentPage ...
             x:Class="GraphicsViewInteractionsAndGestures.MainPage">
    <Grid>
        ...
        <GraphicsView x:Name="GView"
                      Grid.Row="0"
                      StartInteraction="StartInteraction"
                      DragInteraction="DragInteraction"
                      EndInteraction="EndInteraction">
            <GraphicsView.GestureRecognizers>
                <!-- All gestures must be commented out for above interactions to fire on Android -->
                <TapGestureRecognizer Tapped="GView_Tapped"
                                      Buttons="Primary" />
                <PointerGestureRecognizer PointerMoved="GView_PointerMoved" />
            </GraphicsView.GestureRecognizers>
        </GraphicsView>
        <Label x:Name="StatusLabel"
               Grid.Row="1"
               HorizontalOptions="CenterAndExpand" />
    </Grid>
</ContentPage>
On Android, however, neither works. Only an initial tap is detected, and even that does not work properly: the tap event fires, but the point is not set.
After commenting out the gestures, the interactions work on Android:
Steps to Reproduce
See .NET 7.0 reproduction project for the MainPage.xaml.cs.
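For reference, code-behind along these lines is enough to observe the problem. The handler names come from the XAML above; the bodies here are an illustrative sketch, and the reproduction project remains authoritative.

```csharp
// Hypothetical MainPage.xaml.cs sketch matching the XAML above.
// Handler names are taken from the XAML; bodies are illustrative only.
public partial class MainPage : ContentPage
{
    public MainPage() => InitializeComponent();

    // Never fires on Android while the gesture recognizers are attached.
    void StartInteraction(object sender, TouchEventArgs e) =>
        StatusLabel.Text = $"Start at {e.Touches[0]}";

    void DragInteraction(object sender, TouchEventArgs e) =>
        StatusLabel.Text = $"Drag at {e.Touches[0]}";

    void EndInteraction(object sender, TouchEventArgs e) =>
        StatusLabel.Text = "End";

    void GView_Tapped(object sender, TappedEventArgs e) =>
        StatusLabel.Text = $"Tapped at {e.GetPosition(GView)}";

    void GView_PointerMoved(object sender, PointerEventArgs e) =>
        StatusLabel.Text = $"Pointer at {e.GetPosition(GView)}";
}
```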
Expected outcome: Gestures and Interactions both work on Android as they do on Windows.
Actual outcome: On Android, Gestures work only when Interactions are not used, and vice versa.
Link to public reproduction project repository
https://github.com/soyfrien/GraphicsViewInteractionsAndGestures
Version with bug
7.0.403 and 8.0.0-rc.2.9373
Is this a regression from previous behavior?
Not sure, did not test other versions
Last version that worked well
Unknown/Other
Affected platforms
Android
Affected platform versions
Android 13
Did you find any workaround?
By using platform conditionals to wire up the gestures in code, I was able to get a working solution that allows me to have extra right-click functionality on Windows. For example, in the constructor:
...
InitializeComponent();
#if WINDOWS
...
TapGestureRecognizer gViewTappedSecondaryGesture = new();
gViewTappedSecondaryGesture.Tapped += MethodNameFor_SecondaryTapped;
gViewTappedSecondaryGesture.Buttons = ButtonsMask.Secondary;
GView.GestureRecognizers.Add(gViewTappedSecondaryGesture);
...
#endif
So, the workaround is to not use Gestures with Interactions on Android.
Relevant log output
> dotnet workload list
Installed Workload Id    Manifest Version                   Installation Source
--------------------------------------------------------------------------------------------------
android                  34.0.0-rc.2.468/8.0.100-rc.2       VS 17.8.34219.65, VS 17.7.34221.43
maui-windows             8.0.0-rc.2.9373/8.0.100-rc.2       VS 17.8.34219.65, VS 17.7.34221.43
maccatalyst              16.4.8968-net8-rc2/8.0.100-rc.2    VS 17.8.34219.65, VS 17.7.34221.43
ios                      16.4.8968-net8-rc2/8.0.100-rc.2    VS 17.8.34219.65, VS 17.7.34221.43
maui-maccatalyst         8.0.0-rc.2.9373/8.0.100-rc.2       VS 17.7.34221.43
maui-ios                 8.0.0-rc.2.9373/8.0.100-rc.2       VS 17.7.34221.43
maui-android             8.0.0-rc.2.9373/8.0.100-rc.2       VS 17.7.34221.43
Verified this on Visual Studio Enterprise 17.8.0 Preview 5.0 (8.0.0-rc.2.9373). Reproduces on Android 14.0 (API 34); does not reproduce on Windows 11 with the project below: GraphicsViewInteractionsAndGestures.zip
I updated the workaround to set up the Windows gestures in code-behind using platform conditionals. The conflict can be avoided by letting all platforms get Interactions from XAML, while the platforms where gestures work get them from code.
I'm also having this issue on iOS and Android, but it works fine on Windows. Please address this in RC2 if possible @samhouts
I just faced the same issue, and I think this is quite critical, because it makes things like PinchGestureRecognizer basically useless, as they're only meaningful on systems with multi-touch screens.
In the end it seems the only workable solution is to use the Interaction events for all platforms.
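As a sketch of that approach, the Interaction events can drive all pointer handling without any gesture recognizers attached. The names `GView` and `_drawable` (with its `Begin`/`MoveTo`/`End` methods) are assumptions for illustration, not part of the MAUI API:

```csharp
// Sketch: drive dragging from the Interaction events only, with no
// gesture recognizers attached. GView is a GraphicsView; _drawable is
// a hypothetical IDrawable that tracks the touch points it is given.
GView.StartInteraction += (s, e) => _drawable.Begin(e.Touches[0]);
GView.DragInteraction += (s, e) =>
{
    _drawable.MoveTo(e.Touches[0]);
    GView.Invalidate(); // request a redraw after each move
};
GView.EndInteraction += (s, e) => _drawable.End();
```

The trade-off is losing multi-button and multi-touch semantics that the recognizers would otherwise provide, which is what the wrapper class further down tries to restore per platform.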
I have the same issue. I was trying to create a repeat button like in WPF, but PointerPressed and PointerReleased don't work on Android (tried an Image, a Shape, and a Button); none of the recognizers work.
Additional remark: for reasons unknown to me, TouchEventArgs does not contain a Buttons property, so platform-specific code is still necessary on platforms that support multiple mouse buttons. Consider adding a Buttons property to this (and perhaps similar) EventArgs classes.
This is still an issue on Android. iOS and Windows work as expected; on Android you don't receive any events.
I'm facing the same issue; is there any workaround?
I have just ignored the issue for now. My application works fine with zooming and panning on Windows and iOS, Android doesn't get any events at all. It worked a few versions ago. I don't know exactly when it quit working.
I was able to work around this problem by manually implementing the gesture handling. I've written an intermediate class that makes the GraphicsView fairly generic. Just use this class instead of GraphicsView and let your ViewModel implement IViewModelWithInteraction. In my case, the view model already contains the logic to handle mouse dragging the way it was done with WinForms or WPF, so the wrapper class translates everything to common Windows logic.
public class GraphicsViewWithMouseSupport : GraphicsView, IDisposable
{
    private readonly IViewModelWithInteraction m_viewModel;
    private PointerState m_lastButtonStates;
    private double? m_lastDistance;
    private PointF m_lastMousePosition;
    private SizeF m_lastSize;

    public GraphicsViewWithMouseSupport(IViewModelWithInteraction viewModel)
    {
        m_viewModel = viewModel;
        m_viewModel.ControlSize = new Size(Width, Height); // Setting the control size to 0 or less throws exceptions later
        BindingContext = m_viewModel;
        Drawable = m_viewModel.GetDrawable();
        SizeChanged += OnSizeChanged;
        m_lastMousePosition = new PointF();
        m_lastSize = new SizeF();
        Unloaded += (sender, args) =>
        {
            if (BindingContext is IDisposable disp)
            {
                disp.Dispose();
            }
        };
        m_lastButtonStates = new();
    }

    public IViewModelWithInteraction ViewModel => m_viewModel;

    protected override void OnHandlerChanged()
    {
        base.OnHandlerChanged();
#if WINDOWS
        if (Handler?.PlatformView is Microsoft.Maui.Platform.PlatformTouchGraphicsView x)
        {
            x.PointerWheelChanged += (s, e) =>
            {
                var pt = e.GetCurrentPoint(x);
                int mouseWheelDelta = pt.Properties.MouseWheelDelta;
                Windows.Foundation.Point p = pt.Position;
                if (mouseWheelDelta != 0)
                {
                    m_viewModel?.MouseMove(m_lastButtonStates, mouseWheelDelta / 60, new Point(p.X, p.Y), false);
                }
            };
            x.PointerPressed += (s, e) =>
            {
                var buttons = ButtonsPressed(x, e, false);
                var pt = e.GetCurrentPoint(x).Position;
                OnStartInteraction(s, new TouchEventArgs(new[] { new PointF((float)pt.X, (float)pt.Y) }, true), buttons);
            };
            x.PointerReleased += (s, e) =>
            {
                var buttons = ButtonsPressed(x, e, true);
                var pt = e.GetCurrentPoint(x).Position;
                OnEndInteraction(s, new TouchEventArgs(new[] { new PointF((float)pt.X, (float)pt.Y) }, true), buttons);
            };
            x.PointerMoved += (s, e) =>
            {
                var buttons = ButtonsPressed(x, e, true);
                var pt = e.GetCurrentPoint(x).Position;
                OnDragInteraction(s, new TouchEventArgs(new[] { new PointF((float)pt.X, (float)pt.Y) }, true), buttons);
            };
        }
        else
        {
            throw new PlatformNotSupportedException("This is supposedly windows, but the window type doesn't match");
        }
#else
        StartInteraction += (s, e) => OnStartInteraction(s, e, PointerState.LeftButton);
        EndInteraction += (s, e) => OnEndInteraction(s, e, PointerState.LeftButton);
        DragInteraction += (s, e) => OnDragInteraction(s, e, PointerState.LeftButton);
        if (OperatingSystem.IsIOS())
        {
            // On iOS, the above never reports more than one touch in TouchEventArgs
            var p = new PinchGestureRecognizer();
            p.PinchUpdated += (s, e) =>
            {
                PointF center = new PointF((float)(e.ScaleOrigin.X * Width), (float)(e.ScaleOrigin.Y * Height));
                SizeF delta = new SizeF(1, 0);
                if (e.Status == GestureStatus.Started)
                {
                    m_lastSize = new SizeF(1.0f, 1.0f);
                    var ts = new TouchEventArgs(new PointF[] { center + delta, center - delta }, true);
                    OnStartInteraction(s, ts, PointerState.LeftButton);
                    m_lastDistance = null;
                }
                if (e.Status == GestureStatus.Running)
                {
                    // Convert scale (relative to previous call!) to a distance left and right of the center
                    SizeF newSize = m_lastSize * (float)e.Scale;
                    delta = new SizeF(newSize.Width * 7, 0);
                    var ts = new TouchEventArgs(new PointF[] { center + delta, center - delta }, true);
                    OnDragInteraction(s, ts, PointerState.LeftButton);
                    m_lastSize = newSize;
                }
                if (e.Status == GestureStatus.Canceled || e.Status == GestureStatus.Completed)
                {
                    var ts = new TouchEventArgs(new PointF[] { center + delta, center - delta }, true);
                    OnEndInteraction(s, ts, PointerState.LeftButton);
                    m_lastDistance = null;
                }
            };
            GestureRecognizers.Add(p);
        }
#endif
    }

    protected virtual void OnSizeChanged(object? sender, EventArgs args)
    {
        m_viewModel.ControlSize = new Size(Width, Height);
        Invalidate();
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposing)
        {
            m_viewModel?.Dispose();
        }
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

#if WINDOWS
    private PointerState ButtonsPressed(Microsoft.Maui.Platform.PlatformTouchGraphicsView sender, Microsoft.UI.Xaml.Input.PointerRoutedEventArgs e, bool buttonUp)
    {
        var ret = new PointerState(false, false, false);
        var properties = e.GetCurrentPoint(sender).Properties;
        ret.Left = properties.IsLeftButtonPressed;
        ret.Right = properties.IsRightButtonPressed;
        ret.Center = properties.IsMiddleButtonPressed;
        if (buttonUp)
        {
            // For MouseButtonUp, we have to invert the logic: if the button was previously pressed, we set the bit here
            if (m_lastButtonStates.Left && ret.Left == false)
            {
                ret.Left = true;
                m_lastButtonStates.Left = false;
            }
            if (m_lastButtonStates.Right && ret.Right == false)
            {
                ret.Right = true;
                m_lastButtonStates.Right = false;
            }
            if (m_lastButtonStates.Center && ret.Center == false)
            {
                ret.Center = true;
                m_lastButtonStates.Center = false;
            }
            return ret;
        }
        m_lastButtonStates = ret;
        return ret;
    }
#endif

    private void OnStartInteraction(object? sender, TouchEventArgs e, PointerState buttons)
    {
        m_lastDistance = null;
        m_lastMousePosition = e.Touches[0];
        if (e.Touches.Length == 1)
        {
            m_viewModel.MouseDown(buttons, e.Touches[0]);
        }
    }

    private void OnEndInteraction(object? sender, TouchEventArgs e, PointerState buttons)
    {
        m_lastMousePosition = e.Touches[0];
        if (e.Touches.Length == 1)
        {
            m_viewModel.MouseUp(buttons, e.Touches[0]);
        }
        else if (e.Touches.Length == 2)
        {
            m_viewModel.MouseUp(buttons, e.Touches[0]);
            m_lastDistance = null;
        }
    }

    private void OnDragInteraction(object? sender, TouchEventArgs e, PointerState buttons)
    {
        m_lastMousePosition = e.Touches[0];
        if (e.Touches.Length == 1)
        {
            m_viewModel.MouseMove(buttons, 0, e.Touches[0], false);
        }
        else if (e.Touches.Length == 2)
        {
            double distance = e.Touches[0].Distance(e.Touches[1]);
            Point midPoint = new PointF((e.Touches[0].X + e.Touches[1].X) / 2.0f, (e.Touches[0].Y + e.Touches[1].Y) / 2.0f);
            double delta = 0;
            if (m_lastDistance.HasValue)
            {
                delta = m_lastDistance.Value - distance;
            }
            m_viewModel.MouseMove(buttons, (int)Math.Round(-delta), midPoint, true);
            m_lastDistance = distance;
        }
    }
}
The interface:
public interface IViewModelWithInteraction : IDrawable, IDisposable
{
    Size ControlSize
    {
        get;
        set;
    }

    IDrawable GetDrawable()
    {
        return this;
    }

    void MouseDown(PointerState buttons, Point position);
    void MouseUp(PointerState buttons, Point position);
    void MouseMove(PointerState buttons, int wheel, Point position, bool pinching);
}
PointerState is a simple record:
public record struct PointerState(bool Left, bool Right, bool Center)
{
    public static readonly PointerState LeftButton = new PointerState(true, false, false);
}
@samhouts @PureWeen @rmarinho @mattleibow @Foda Can you please look at this? It has been months! GraphicsView is one of the most important controls, and I can't integrate Pinch/Pan/etc. with it.