jitouch
Send/receive gestures over Universal Control
Examples: Scenario 1: Mac with Jitouch installed shares a Magic Trackpad/keyboard with a second Mac without Jitouch installed. Use a gesture. Expected behavior: nothing; the second Mac should behave as if Jitouch is not installed. Actual behavior: focus suddenly jumps back to the host machine and starts doing odd things, like renaming files.
Scenario 2: Mac without Jitouch installed shares a Magic Trackpad/keyboard with a second Mac with Jitouch installed. Use a gesture. Expected behavior: Jitouch on the second machine intercepts and interprets the gesture. Actual behavior: nothing; the second Mac behaves as if Jitouch is not installed.
I know that Universal Control is sending gesture data over, since I can perform all system gestures just fine. Interestingly, if the host machine has "Swipe up with 3 fingers" set to view Mission Control, and the second machine has "Swipe up with 4 fingers" set to view Mission Control, the host machine behaves as expected, but the second machine can view Mission Control by swiping up with either 3 or 4 fingers.
Obviously, the second scenario is more of a feature request, but it feels like a bug.
Thanks for reporting this. Jitouch doesn't support Universal Control yet, and it probably won't be possible to send gestures to another device unless Apple adds an API to send messages over UC.
Jitouch (and all the other gesture apps) uses an undocumented framework, MultitouchSupport.framework, to detect raw touch events on multi-touch devices attached to the machine. This is actually a separate system from the one that handles UI events like clicks and mouse movement, which is in turn separate from the UC system that sends events from Controller to Target. The events that can be sent and received over UC are much more restricted than those that can be accessed locally through the MultiTouch framework. macOS may have defined events for the built-in gestures, but that doesn't mean custom ones will work.
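For context, that raw-touch path can be poked at from userspace with a few lines of ctypes. A minimal sketch, assuming the community-reverse-engineered symbol MTDeviceCreateList from the same private framework Gesture.m links against; nothing here is documented by Apple and it may break in any macOS release:

```python
import ctypes

_MT = ("/System/Library/PrivateFrameworks/"
       "MultitouchSupport.framework/MultitouchSupport")
_CF = ("/System/Library/Frameworks/"
       "CoreFoundation.framework/CoreFoundation")

def count_multitouch_devices():
    """Number of locally attached multitouch devices, or None off macOS."""
    try:
        mt = ctypes.CDLL(_MT)
        cf = ctypes.CDLL(_CF)
    except OSError:
        return None  # not on macOS, or the private framework moved
    # Reverse-engineered signature: MTDeviceCreateList() -> CFArrayRef
    mt.MTDeviceCreateList.restype = ctypes.c_void_p
    cf.CFArrayGetCount.restype = ctypes.c_long
    cf.CFArrayGetCount.argtypes = [ctypes.c_void_p]
    devices = mt.MTDeviceCreateList()
    return cf.CFArrayGetCount(devices) if devices else 0

if __name__ == "__main__":
    print(count_multitouch_devices())
```

The full gesture path additionally registers a per-frame contact callback on each device; that callback is where trackpadCallback in Gesture.m hooks in.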
For Jitouch over UC to work, Jitouch would have to be running on the Controller and Target. On the Controller, it would have to detect that UC is active, and then when a gesture is performed, send events over UC to the correct Target device. On the Target, Jitouch must listen for events over UC so that it can perform the corresponding action. However, as far as I can tell, there is no third-party access to the UC system, so I can't send or receive the custom events needed for this to work.
Regarding scenario 1, this should be simpler to fix – all I need is for Jitouch to detect whether UC is running and the focus is on a different computer. However, I haven't yet found a way to interface with UC even for this simple check. Hopefully there is a way to do it, though.
What are the gestures + actions you performed to get odd behavior like renaming files? There are some actions where focus is grabbed at the mouse location or the current window, which probably don't account for UC. I don't have a second Mac to test this on, so my ability to support UC is limited.
Well, I do have two Monterey M1 machines that I'd be willing to use to help. I'll get Xcode installed and start poking around. If you want me to supply any logs or anything else, I can do that too.
Sure, you are welcome to take a look. My issue so far is I can't even find a way to programmatically detect if the mouse is currently being sent to a different computer over Universal Control.
One of the functions that would probably have to change is getMousePosition. It gets used in a lot of actions.
https://github.com/aaronkollasch/jitouch/blob/572bed758e6d38b0671a1b53f2d269539c21cfde/jitouch/Jitouch/Gesture.m#L203-L209
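One heuristic worth testing here, sketched in Python/ctypes under an unverified assumption: ask CoreGraphics where the cursor is (the ctypes equivalent of what getMousePosition does) and check it against the local display bounds. CGEventCreate, CGEventGetLocation, CGGetActiveDisplayList, and CGDisplayBounds are public CoreGraphics calls; whether the reported position ever leaves the local display bounds while UC owns the pointer is exactly the open question.

```python
import ctypes

class CGPoint(ctypes.Structure):
    _fields_ = [("x", ctypes.c_double), ("y", ctypes.c_double)]

class CGSize(ctypes.Structure):
    _fields_ = [("width", ctypes.c_double), ("height", ctypes.c_double)]

class CGRect(ctypes.Structure):
    _fields_ = [("origin", CGPoint), ("size", CGSize)]

def point_on_local_display(point, rects):
    """Pure check: is (x, y) inside any (x, y, w, h) display rect?"""
    px, py = point
    return any(x <= px < x + w and y <= py < y + h for (x, y, w, h) in rects)

def cursor_and_displays():
    """macOS-only: return (cursor position, list of local display rects)."""
    cg = ctypes.CDLL("/System/Library/Frameworks/"
                     "CoreGraphics.framework/CoreGraphics")
    cg.CGEventCreate.restype = ctypes.c_void_p
    cg.CGEventCreate.argtypes = [ctypes.c_void_p]
    cg.CGEventGetLocation.restype = CGPoint
    cg.CGEventGetLocation.argtypes = [ctypes.c_void_p]
    cg.CGDisplayBounds.restype = CGRect
    cg.CGDisplayBounds.argtypes = [ctypes.c_uint32]

    event = cg.CGEventCreate(None)       # snapshot of current HID state
    loc = cg.CGEventGetLocation(event)

    ids = (ctypes.c_uint32 * 16)()
    count = ctypes.c_uint32(0)
    cg.CGGetActiveDisplayList(16, ids, ctypes.byref(count))
    rects = []
    for i in range(count.value):
        r = cg.CGDisplayBounds(ids[i])
        rects.append((r.origin.x, r.origin.y, r.size.width, r.size.height))
    return (loc.x, loc.y), rects

if __name__ == "__main__":
    pos, displays = cursor_and_displays()
    print("cursor on a local display:", point_on_local_display(pos, displays))
```

If the position simply freezes at the hot-zone edge while UC is active, this heuristic fails and some other signal is needed.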
Here's a log of a short Universal Control session between two Macs, if you're interested. https://pastebin.com/PrFwF5r5 You'll see in the first few lines where it declares the hot zone of my main monitor "7795E397" and knows that the target destination monitor is "4647F032", connected to my second machine. Later on it logs "Took Platform Assertion" and cancels the remote machine's "V-Apple Internal Keyboard / Trackpad", which may be a way for Jitouch installed on the client machine to detect that UC is being received. Not sure yet how to tell from the host that UC is sending the mouse input away.
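In case it helps, here's a tiny Python sketch of that detection idea: filter the unified log (e.g. the output of `log stream`) on the receiving Mac for the two markers above. The marker strings are copied from this one capture and the sample lines are made-up stand-ins, so treat it as a heuristic, not a stable interface:

```python
# Marker strings observed in one UC log capture; may vary across macOS versions.
HANDOFF_MARKERS = (
    "Took Platform Assertion",
    "V-Apple Internal Keyboard / Trackpad",
)

def is_uc_handoff_line(line):
    """True if a unified-log line looks like a UC handoff event."""
    return any(marker in line for marker in HANDOFF_MARKERS)

def scan_log(lines):
    """Keep only the lines that look UC-related."""
    return [line for line in lines if is_uc_handoff_line(line)]

if __name__ == "__main__":
    # Illustrative stand-ins for lines in the pastebin, not verbatim copies:
    sample = [
        "universalcontrol: hot zone registered for display 7795E397",
        "universalcontrol: Took Platform Assertion",
        "hidd: cancel V-Apple Internal Keyboard / Trackpad",
        "unrelated log chatter",
    ]
    for line in scan_log(sample):
        print(line)
```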
As for the bizarre behavior I reported before, it was simply my gestures firing while programs they weren't meant for were active. (Duh, I should have known that.)
Can you run this Xcode project on the client machine with and without UC connected? https://github.com/aaronkollasch/MTDeviceDump It will list the multitouch devices currently attached. If a new device appears while UC is being received, there is a chance the client could receive Jitouch gestures. If not, there is probably no way to receive gestures over UC beyond those Apple has defined.
2022-08-17 10:33:12.949530-0500 MTDeviceDump[15452:444713] Device 0 deviceID: 144115188075855919 familyID: 105 driverType: 4 version: 1332 productName: Apple Internal Keyboard / Trackpad dimensions: 22 x 30 surf. dim.: 157.800 mm x 97.800 mm opaque: false built-in: true is HID device: true
Output doesn't change when UC is running...
Is that on the sender or receiver machine? Sorry if I wasn't clear. If the MT device doesn't show up on the receiver, then gestures over UC will be very hard to support.
It seems that UC doesn't use the private MultitouchSupport.framework – see the output of otool -L /System/Library/CoreServices/UniversalControl.app/Contents/MacOS/UniversalControl – though perhaps /System/Library/PrivateFrameworks/UniversalControl.framework/Versions/A/UniversalControl does use the framework. I guess Apple would have to modify MultitouchSupport.framework to also accept a simulated device from UC, when the easier (and more efficient) approach is for UC to send only gesture events when they occur, not the raw stream of finger locations that the MT framework provides.
The other option is to scan the private UniversalControl framework for symbols that could be useful, in /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/System/Library/PrivateFrameworks/UniversalControl.framework/Versions/Current/UniversalControl.tbd, and then reverse-engineer them, similar to what was done for MultitouchSupport. I'm not sure UC sends the information Jitouch needs, though.
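Since .tbd stubs are YAML text, a plain token filter is enough for a first pass over that symbol list. A crude sketch; the sample content below is a made-up placeholder (the symbol names are hypothetical), and the real input would be the .tbd file at the SDK path above:

```python
import re

def find_symbols(tbd_text, needle):
    """Return the unique symbol-like tokens in a .tbd that contain `needle`."""
    tokens = re.findall(r"[_A-Za-z][A-Za-z0-9_$]*", tbd_text)
    return sorted({t for t in tokens if needle in t})

if __name__ == "__main__":
    # Illustrative placeholder, NOT the real contents of UniversalControl.tbd:
    sample = "symbols: ['_UCExampleStartSession', '_UCExampleSendEvent']"
    print(find_symbols(sample, "UC"))
```

In practice this would be run over open(path).read() for the .tbd, and anything interesting would then go to a disassembler.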
I'm going to classify this as wontfix unless/until we find that UniversalControl.framework exposes an API that's similar enough to MultitouchSupport.framework that it can be plugged in with minimal changes (i.e. it allows for a callback function like trackpadCallback).
I'm also making this the "Send gestures to another machine via Universal Control" issue. Feel free to open a new issue if you find something like Scenario 1, where you see wonky behavior while Universal Control is active, but only for local gestures on the same machine.