accessibility-insights-windows
Invoke ShowContextMenu on Nodes
Is your feature request related to a problem? Please describe. Narrator Touch has a way to invoke a context menu on the currently focused element by double-tapping with two fingers. It sounds like the underlying API for this is IUIAutomationElement3::ShowContextMenu (a minimal sketch of that call is included after this issue body).
I don't see a way to simulate this gesture through Accessibility Insights. Not every development environment has touch capabilities, and with work becoming increasingly remote, it's harder to get to the office to borrow a touch device, which makes it difficult to validate changes that add or otherwise modify the behavior that responds to this gesture.
Describe the solution you'd like Similar to how Accessibility Insights lets us trigger and simulate the various patterns exposed on a UIA node (such as Invoke or Expand), it would be nice to have a way to trigger the context menu touch gesture as well.
Describe alternatives you've considered I have not tried any alternatives.
Additional context Please let me know if Accessibility Insights already has this capability and, if so, where to access it.
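For context, here is a minimal, untested C++ sketch of what the underlying call could look like using the native UIA client API; it is not part of the original request. It assumes COM is already initialized on the calling thread (CoInitializeEx), that the platform is Windows 8.1 or later (where IUIAutomationElement3 is available), and the function name ShowContextMenuOnFocusedElement is made up for illustration.

```cpp
// Sketch only: invoke ShowContextMenu on the element that currently has focus.
// Assumes CoInitializeEx has already been called on this thread.
#include <UIAutomation.h>
#include <atlbase.h>   // CComPtr

HRESULT ShowContextMenuOnFocusedElement()
{
    // Create the UIA client object.
    CComPtr<IUIAutomation> automation;
    HRESULT hr = automation.CoCreateInstance(__uuidof(CUIAutomation));
    if (FAILED(hr)) return hr;

    // Get the element with keyboard focus (a tool like Accessibility Insights
    // would presumably use its currently selected node instead).
    CComPtr<IUIAutomationElement> focused;
    hr = automation->GetFocusedElement(&focused);
    if (FAILED(hr)) return hr;
    if (!focused) return E_FAIL;

    // ShowContextMenu is exposed on IUIAutomationElement3 (Windows 8.1+).
    CComPtr<IUIAutomationElement3> element3;
    hr = focused.QueryInterface(&element3);
    if (FAILED(hr)) return hr;

    return element3->ShowContextMenu();
}
```

Accessibility Insights for Windows is a .NET application, so it would presumably reach ShowContextMenu through the UIAutomationClient interop types rather than native C++, but the call sequence would be the same: obtain an IUIAutomationElement for the selected node, query it for IUIAutomationElement3, and call ShowContextMenu.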
This issue has been marked as ready for team triage; we will triage it in our weekly review and update the issue. Thank you for contributing to Accessibility Insights!
Thanks for the feature request, @jesmores! This is currently not supported, and we agree that this would be an interesting feature. It will also need some additional work to determine where it would fit in the current user experience. We are leaving this feature marked as needs investigation so the community can up-vote it if this is something they would like to have in Accessibility Insights for Windows.
This issue requires additional investigation by the Accessibility Insights team. When the issue is ready to be triaged again, we will update the issue with the investigation result and add "status: ready for triage". Thank you for contributing to Accessibility Insights!
Thanks for taking a look. Do you know of any alternative ways to simulate the touch gesture in the absence of a touch device?
@guybark, would you know?
This issue has been marked as ready for team triage; we will triage it in our weekly review and update the issue. Thank you for contributing to Accessibility Insights!
This issue has a confusing comment history, so I am trying to reset. I believe we need PM and design to tell us how we should implement the UX for invoking the context menu. Invoking the context menu itself shouldn't be too much work from an engineering perspective, on top of implementing the UX.
This issue requires additional investigation by the Accessibility Insights team. When the issue is ready to be triaged again, we will update the issue with the investigation result and add "status: ready for triage". Thank you for contributing to Accessibility Insights!
Unfortunately, this is out of scope given our current priorities.
Thanks for using Accessibility Insights!