mouse-actions
I just wanted to say thank you
Truly. I NEED mouse gestures because of hand problems, and I have been stuck on Windows (with StrokesPlus) for nearly 10 years, waiting for this app (and NVIDIA drivers, ahem) to happen so that I could go Linux/Wayland. I've been here since 0 stars, using it in a VM in conjunction with ydotool to make sure it would all behave, and I have spent the past two weeks making the heavily customised bare-metal Linux machine a reality - enough to be able to create a GitHub account and write this! hahaha.
All the while, I have been harbouring a deep gratitude toward you for having made this app happen. Thank you so much, you've made a disabled old Linux dev very happy, and filled a gaping void in accessibility on Wayland.
I'm working on packaging this (and related tools such as ydotool) using OBS (the Open Build Service) for openSUSE - Tumbleweed specifically, but it should build for any OS after that is working. It turns out to be quite the challenge, since openSUSE/RPM packages are built in a non-networked sandbox, so neither npm nor cargo are straightforward; it is..... a learning experience ;) But I have had some success, and I hope to be able to return the favour to you through this packaging/repository/distribution method at least. I also have a few FRs and potentially PRs, if you are interested in collaboration? I would really like to help out if I can in any way, and there are a couple of features I'd really like to add if you are interested :)
Thank you very much for your message, it makes me very happy! I take it as a reward for having taken the extra steps of distributing and improving the tool.
I am of course interested in collaborations! But about releases, I really like the principle of having a single binary that contains everything. That said, I can imagine packages that add docs and a "mouse_action.desktop" to facilitate integration of the application into the system, and automatic installation of the necessary dependencies (webkit2gtk, gtk3 for the GUI).
But I don't see a dependency on ydotool or anything else. Maybe I'm not thinking of a feature that the tool lacks?
A potential evolution (visible at the bottom of the Readme: "use rdev send()?") is to directly manage the sending of keyboard and mouse events: use "send()" from "rdev", which uses "evdev" and goes through "/dev/uinput" as "ydotool" does. Is this what you have in mind?
The priority of this feature could be moved up.
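To make that evolution concrete, here is a minimal sketch of what binding a gesture to a keyboard shortcut might look like with rdev's event-sending API (the published crate calls it `simulate`; the thread calls it `send()`). This is illustrative only, not mouse-actions code, and it needs a session where synthetic input is permitted:

```rust
// Hedged sketch: send Ctrl+W (close tab) with rdev's `simulate`.
// The `send` helper name and the delay value are illustrative choices.
use rdev::{simulate, EventType, Key, SimulateError};
use std::{thread, time};

fn send(event_type: &EventType) -> Result<(), SimulateError> {
    let result = simulate(event_type);
    // Small pause so the window manager keeps up with synthetic events.
    thread::sleep(time::Duration::from_millis(20));
    result
}

/// Press and release Ctrl+W, as a "close tab" gesture action might.
fn close_tab() -> Result<(), SimulateError> {
    send(&EventType::KeyPress(Key::ControlLeft))?;
    send(&EventType::KeyPress(Key::KeyW))?;
    send(&EventType::KeyRelease(Key::KeyW))?;
    send(&EventType::KeyRelease(Key::ControlLeft))?;
    Ok(())
}
```

Mouse events work the same way via `EventType::ButtonPress`, `EventType::ButtonRelease` and `EventType::MouseMove { x, y }`.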
In order to package this application, I think it would be necessary to clean it up and stabilize it, because even though it works well overall for me, it still crashes regularly.
I'm glad to know that the tool works on Wayland, I haven't tested it much on it! Some work could also be done to hide what isn't possible on Wayland (edge events).
> having taken the extra steps of distributing and improving the tool.
I really do appreciate it :)
> I am of course interested in collaborations!
That's great, I will put some time into this very soon. My PC still needs some fixing up, but I won't be long ;)
> But about releases, I really like the principle of having a single binary that contains everything.
I did have a few problems using the AppImage, but you already know about this - it's about the automatic detection of Wayland and applying the -n switch as applicable. When I used the AppImage to load the GUI (with -n, which works), then saved my changes and exited the GUI, it would load the main tool without the -n/no-listen switch, and it would fail, so I needed to use the tools separately so I could specify that argument.... But you have [already mentioned this Wayland auto-detection in the readme](https://github.com/jersou/mouse-actions#low), so maybe that's something I can help with. Please don't feel that I am putting the load on you to do this or other work; I'll be happy to try and make it just however you'd like it :)
> That said, I can imagine packages that add docs and a "mouse_action.desktop" to facilitate integration of the application into the system, and automatic installation of the necessary dependencies (webkit2gtk, gtk3 for the GUI).
The idea of packaging this in the openSUSE OBS is that they have a kind of chain where a package starts out in a user's 'home' repo (like mine is now), and can then be 'promoted' until it reaches the main repository for the entire distribution, at which point any user who installs the distro will find it in the default package management tools by searching. So at first it provides me a way to test out any changes I make with automatic builds (just like GitHub Actions does already - I noticed you added that recently, and saw you stumble on the same build dependencies I did), and also to target multiple DEs automatically with the same build script (e.g. different paths for libs, config and executable files, and different package types: RPM, DEB, etc.). But in the end it makes the tool available to everyone on openSUSE, which is something I'd really like to provide to the community.
> That said, I can imagine packages that add docs and a "mouse_action.desktop" to facilitate integration of the application into the system, and automatic installation of the necessary dependencies (webkit2gtk, gtk3 for the GUI).
'Great minds think alike' hehe, my package does this... mostly because I'm on KDE so these were not present already ;)
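For reference, a desktop entry for the package could look something like the sketch below. The `Exec` command, icon name and categories are guesses on my part, not taken from the actual package; adjust them to whatever the package installs:

```ini
[Desktop Entry]
Type=Application
Name=Mouse Actions
Comment=Execute commands from mouse gestures and button events
# Exec and Icon are illustrative; use the paths/names your package installs.
Exec=mouse_actions start
Icon=mouse-actions
Terminal=false
Categories=Utility;Accessibility;
```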
> But I don't see a dependency on ydotool or anything else. Maybe I'm not thinking of a feature that the tool lacks? A potential evolution (visible at the bottom of the Readme: "use rdev send()?") is to directly manage the sending of keyboard and mouse events: use "send()" from "rdev", which uses "evdev" and goes through "/dev/uinput" as "ydotool" does. Is this what you have in mind?
Sorry, this paragraph is long - I will put a TL;DR at the end in case you are busy!
Yes, this is exactly it. I want to be able to send keypresses; for example, I do a diagonal line and it presses Ctrl+W to close the tab. A different diagonal line gesture would minimise the window, another would maximize/restore the window, and I had thought to do this with keypresses passed to the WM. One thing I used to have was a gesture where I hold the gesture key (right mouse button for me) and click a link, and it would open in a new tab, as if holding Ctrl and then left-clicking where the gesture was performed. That could be done with rdev send, I'm sure, although it also requires one of my feature requests: that the start and end coordinates of the gesture are available as arguments to the command called (in this example, so rdev knows where to send the Ctrl+click). That same 'gesture button + click' gesture had a double duty: it would open a new browser tab if there was no link under the mouse, by detecting which pointer type was being displayed. That might be beyond the abilities of Linux - although it could likely be scripted if the coordinates are available to pass to an external script.
This gesture, and also another I use a lot (hold gesture button and then scroll wheel up or down = Ctrl+Tab/Ctrl+Shift+Tab, for scrolling through tabs), also require another feature request: to allow mouse buttons as modifiers to gestures, as well as only the keyboard key modifiers it has now. I know that this can work as a 'click' event (rather than a 'shape') on the scroll up/down button with keyboard modifiers, so if I could use mouse buttons as modifiers, I could have my old one-handed control again.
TL;DR: rdev send events would be great!!!!! I would like to see two new features especially: 1) mouse buttons as modifiers for gestures, 2) gesture start/end coordinates as variables to use in arguments to the gesture command.
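Just to illustrate feature 2), the coordinates could be exposed as placeholders that get substituted into the command before it is spawned. The placeholder names here (`{start_x}` etc.) and the helper are entirely hypothetical, invented for this sketch:

```rust
// Hypothetical sketch of the coordinates-as-arguments feature request:
// substitute gesture start/end coordinates into a command template
// before spawning it. Placeholder names are invented for illustration.
fn expand_template(template: &str, start: (i32, i32), end: (i32, i32)) -> String {
    template
        .replace("{start_x}", &start.0.to_string())
        .replace("{start_y}", &start.1.to_string())
        .replace("{end_x}", &end.0.to_string())
        .replace("{end_y}", &end.1.to_string())
}

fn main() {
    // e.g. a bound command that acts where the gesture began:
    let cmd = expand_template(
        "my-click-helper --at {start_x},{start_y}",
        (640, 360),
        (900, 360),
    );
    println!("{cmd}"); // prints "my-click-helper --at 640,360"
}
```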
> The priority of this feature could be moved up.

That would be awesome :) But please don't feel rushed, and maybe I can help out too very soon once I have my PC in working order and I'm ready to cut some code.
> In order to package this application, I think it would be necessary to clean it up and stabilize it, because even though it works well overall for me, it still crashes regularly.

I must admit I have not really used it for extensive periods yet, mostly short bursts of "I wonder if X is possible?", so I haven't seen it crash unless it was 100% operator error, but I agree stability will be important.
> I'm glad to know that the tool works on Wayland, I haven't tested it much on it! Some work could also be done to hide what isn't possible to do on Wayland (edges events).
Well, I have only used Wayland - some of its features for monitor handling are driving forces for my switch to Linux - so I've never run mouse-actions on X at all! Fortunately I am using KDE Wayland, and edge events are a feature built into the window manager already, so I didn't miss out there. I think that Sway and GNOME have similar features built in also, and I'm sure I've seen a stand-alone app for sway/wlroots compositors to do mouse edge actions, so perhaps it won't be something that's so badly needed on Wayland? Also, just for the record, it does require a 'real' mouse, so if you test mouse-actions in a VirtualBox guest, you have to disable the VirtualBox mouse integration so that the VM grabs the mouse, or pass through a second mouse via USB. Now that I'm on Linux, I'll be using QEMU/KVM/libvirt based VMs instead of VirtualBox from now on, so I'll soon know how it behaves there, and I'll let you know.
The only problem I did see was that sometimes the clicks for the gesture were not 'intercepted' before the WM saw them, so the gesture would be performed but a right-mouse-button click would also be registered by the WM. I'm not sure if that's Wayland-specific, but I imagine it is - I imagine you would have stopped that if it were affecting you on X also. That was my last FR really: to intercept the gestures' inputs and make sure they are not passed on to the WM. I'm not even sure if this is possible though, I'm not yet familiar with rdev.
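Having since looked at rdev a little: it does offer a `grab` API (behind a feature flag on Linux) whose callback can swallow events by returning `None`, which looks like the mechanism that could keep a gesture's right-click from reaching the WM. Whether it works under Wayland is an open question; a minimal sketch of the idea:

```rust
// Hedged sketch using rdev's `grab` (feature-gated on Linux):
// events for which the callback returns None are swallowed and
// never reach the rest of the system.
use rdev::{grab, Button, Event, EventType};

fn main() {
    let result = grab(|event: Event| match event.event_type {
        // Swallow right-button events so the WM never sees them.
        // A real implementation would only do this while a gesture
        // is actually in progress.
        EventType::ButtonPress(Button::Right)
        | EventType::ButtonRelease(Button::Right) => None,
        _ => Some(event),
    });
    if let Err(e) = result {
        eprintln!("grab failed: {e:?}");
    }
}
```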
I feel like I am making lots of work here, and I hope it doesn't seem rude of me. I am totally prepared to do any and all of this all by myself if that is needed, so please don't feel any pressure from any of this, I don't want to create any difficulty for you. I just wanted to make sure that you were OK with the idea of other people working with your creation, and not just be a 'cowboy' about it and go dropping FRs and PRs or forking, uninvited and unexplained :) Sometimes devs prefer to be asked to do the work themselves, sometimes devs prefer others to do the work and submit PRs, some devs are more like "don't touch my thing if you don't like it how it is, fork your own" so I wanted to just try to be polite and ask which way you like things to work.
Whichever way, I just want you to know that I am really grateful for your sharing this with us all, and maybe if I can and you would like, I can return the favour by contributing in some way, maybe with new features, or at least just by making the app natively available to all of OpenSUSE.
Big thank you from me too :) Great job. And now that they have killed khotkeys in KDE 6, I think this tool is the only one left supporting mouse gestures on Linux... so maybe the userbase for the app will increase rapidly.
I use mouse gestures (supported by khotkeys till now) mainly in browsers, in order to avoid any "mouse gesture" extensions... but sometimes I use gestures on Konsole and others.
The way I see it, there are a couple more things to be done so the app becomes even better:
- Tighten the security by avoiding the need for the user to be in the input/plugdev group. The udev rule for uinput, as far as I am aware, is OK - even Steam uses it... Right now I am using an ugly hack via systemd (described in a comment here: https://github.com/jersou/mouse-actions/issues/12#issuecomment-2002521993), but a better solution could probably be built on a system service and user client model... I am not that good with coding, anyway..
- An option to target the gesture/clicks/command etc. at a specific app/window (like in khotkeys) would be great, because one and the same gesture may need different shortcuts for different apps. Right now this can be partially accomplished by using xdotool options like search, windowfocus etc., but that is too inconvenient and works at the level of the command; it would be best done at the level of detecting where the gesture is performed....
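On the first point: the usual rule for granting uinput access without putting the user in the input group is a udev rule that tags the device `uaccess`, so systemd-logind grants access to the currently logged-in user. A sketch of what such a rule commonly looks like (the filename is illustrative); note this only covers *sending* events via /dev/uinput, while *listening* on /dev/input devices would still need the service/client split mentioned above:

```
# /etc/udev/rules.d/60-mouse-actions-uinput.rules (filename illustrative)
# Grant the active logged-in user access to /dev/uinput via logind,
# instead of adding the user to the input group.
KERNEL=="uinput", SUBSYSTEM=="misc", TAG+="uaccess", OPTIONS+="static_node=uinput"
```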
Of course packaging is important also, but @pallaswept is doing a great job in OBS... Thanks to him for that also.
Hope this was useful :)
I just updated my Linux Mint to v22 Wilma and Easystroke stopped working. Instead of investing the time to get this sadly abandoned software running again, I'm looking into mouse-actions. A bit of a steep learning curve, given that the Readme.md only gets you halfway there... but I got the first strokes running, and I'm quite happy to hopefully have a working, maintained gesture mechanism again.
So also Thank You from my side!
Regarding open issues, especially those 'only' on the documentation side, would you be interested in outside help?
Thank you for your feedback!
For the need for the doc, I'd like help :-)
Regarding the fixes and development, I plan to work on them as soon as I have some free time. I still use Mouse-actions extensively myself, and I am surprised that the most annoying bug for me is not (much) reported in the issues: MA crashes/bugs caused by adding/removing devices.
I would like to refactor device management, possibly by removing the rdev library.
For #56, which you also commented on, I fixed the bug in 292310a8e3b08db1a51db59d20a9f120e026eb80, but I haven't merged/released it yet.