OpenBVE
[Request&Suggestion] Some requests about panel touch
Description
1. Get touch events in a .NET plugin
We are developing the circuit breakers of a train, but the number of available keyboard inputs is limited. So we would like the .NET plugin to receive an event when a TouchElement in Panel.xml or Panel.animated.xml is clicked/released.
Our idea is to add two functions that handle touch events (e.g. TouchElementClicked/TouchElementReleased). These functions have two parameters: the number of the Group (panel.animated.xml) / Screen (panel.xml) that the TouchElement is in, and an arbitrary number, defined in the panel XML file, to pass to the plugin.
<Group>
<Number>0</Number>
<Touch>
<PluginCommand>1</PluginCommand>
</Touch>
</Group>
When this touch element is clicked, the following function in the .NET plugin is executed:
public void TouchElementClicked(int group, int command)
// in this situation, group = 0, command = 1
TouchElementReleased is executed when the mouse button is released, and it takes the same parameters.
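A minimal sketch of how the proposed handlers might look on the plugin side. Note that TouchElementClicked/TouchElementReleased are the names suggested above, not an existing API, and the circuit-breaker state here is invented purely for illustration:

```csharp
// Hypothetical plugin-side handlers for the proposed touch events.
// These method names come from the suggestion above; they are not
// part of the current OpenBVE plugin interface.
public class MyTrainPlugin
{
    // Invented example state: a circuit breaker toggled by touch.
    private bool breakerClosed;

    public bool BreakerClosed => breakerClosed;

    // group   = Group (panel.animated.xml) / Screen (panel.xml) index
    // command = the <PluginCommand> value from the panel XML
    public void TouchElementClicked(int group, int command)
    {
        if (group == 0 && command == 1)
        {
            // toggle the breaker on each click
            breakerClosed = !breakerClosed;
        }
    }

    public void TouchElementReleased(int group, int command)
    {
        // e.g. a momentary button could spring back here on release
    }
}
```

With the XML example above, a click on the touch element would arrive as `TouchElementClicked(0, 1)`.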
2. Display Several Groups
If possible, I would also like to display several groups (screens) at once in panel.xml (and panel.animated.xml). This would be useful when multiple touch screens each need to change their touch areas independently (e.g. Screen1's touch area needs to change while Screen2's touch area stays the same). If this were possible, we would not have to duplicate the code for each screen state.
I think this could be done either by adding a parent group that encloses existing groups and letting JumpScreen operate within the parent group, or by adding an initial-visibility option for groups and having JumpScreen hide the current group and show the target group.
I think this is a blind spot that I did not anticipate when I discussed the touch element implementation with S520. I think this should be supported.
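Sketched in panel XML terms, the parent-group idea could look roughly like this. The `<ParentGroup>` and `<InitialVisibility>` elements are hypothetical and do not exist in the current format; this only illustrates the suggestion:

```xml
<!-- Hypothetical syntax: <ParentGroup> and <InitialVisibility> are
     not existing elements; they illustrate the proposal above. -->
<ParentGroup>
  <Group>
    <Number>1</Number> <!-- Screen1: touch layout A, shown initially -->
    <InitialVisibility>true</InitialVisibility>
  </Group>
  <Group>
    <Number>2</Number> <!-- Screen1: alternate touch layout -->
    <InitialVisibility>false</InitialVisibility>
  </Group>
</ParentGroup>
<Group>
  <Number>3</Number> <!-- Screen2: independent, stays visible -->
</Group>
```

Here a JumpScreen inside the parent group would switch between groups 1 and 2 without affecting group 3.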
I would also like to be able to set the plus/minus cursor symbols in the XML file, so that these symbols can be used for touch elements handled by the plugin.
I think they would like an arbitrary cursor shape depending on the situation, such as a "+" when the switch is on and a "-" when it is off. I likewise think it would be nice if that could be realized.
Yes. We are planning to build this switchboard in a 3D cab and apply the panel touch feature to most of the switches, buttons, and screens in the cab.
If this request becomes possible, I think the touch feature will be useful in more diverse situations.
So I hope this request will be accepted!
Added custom cursors, which was easy enough. (Note: if you require transparency, it should be in-image; colour-key transparency is not supported here.)
It seems to me that unless you're planning dummy switches for most of these, the whole cab may get a little busy. I'm not necessarily against adding the function you want, but I'm a little concerned about starting to duplicate code in different places.
Essentially, if you're just using these for animations, I'm not sure it's the best idea :) The whole plugin interface is somewhat of a mess, which has been repeatedly extended, and in an ideal world everything would actually be implemented back into the main sim.
I heard from them before their proposal: they plan to simulate the entire electrical system on the DLL side, and in conjunction with that we need to receive signals from touch events in order to respond to all switches. That is why I suggested it be proposed in this way. None of the switches are dummies; each must actually emit a signal that the DLL can receive. One thing I haven't confirmed is touch elements in 3D space — currently we can only do touch elements in 2D, is that right? What I suggested was adding a touch event in the same way as the per-frame call etc., setting the ATS number and a value 0-255 when touched. But they told me they wanted 0000-FFFF, not 00-FF.
Thank you for your work. Custom cursors work very well :)
Our plan is to simulate all of the electrical systems in the .NET plugin. (Touch in 3D is working with panel.animated.xml.)
But even outside the electrical system, touch is also useful in the signal system, for example for initialization operations (logging in with a driver ID, setting the destination, selecting the signal system, etc.).
There are number and function buttons on this screen, and we can easily control them with touch.
With keyboard input, we would have to remember multiple keys to control them.
This can make the train harder to control, but it adds fun to the play, and we can also provide different difficulty levels.
Added.
~~A little lateral thinking- Would a new pair of events passing the actual keyboard key pressed / released be of interest? (Joystick is somewhat harder to do, but the keyboard might be interesting)~~
Nearly made a stupid mistake there....
Adding to the existing interface breaks things for existing plugins, which is bad. Delegate methods are how we've solved this for most things, but they are for calling in the opposite direction.
Therefore there is now a new interface, designed for plugins wishing to receive raw input from the game window:
IRawRuntime
This contains the following new methods at present:
RawKeyDown - Called when the game window generates a raw key down event.
RawKeyUp - Called when the game window generates a raw key up event.
TouchEvent - Called when the game window generates a touch event.
Please consider this interface unstable and subject to change, even if included in a public release build, until further notice :)
Considering adding joystick button / axis events to it, but I don't really like this too much.
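As a rough sketch, a plugin opting in to raw input could implement the new interface along these lines. The parameter types here are assumptions based on this discussion (the interface is declared unstable above, so check the OpenBveApi source for the current definitions), and the recorded properties are added only for illustration:

```csharp
// Sketch of a plugin consuming the new IRawRuntime interface.
// Member signatures are assumptions, not the authoritative API;
// verify against the OpenBveApi source before relying on them.
public class RawInputPlugin /* : IRuntime, IRawRuntime */
{
    // Illustrative state so the handlers are observable.
    public int LastGroup { get; private set; } = -1;
    public int LastCommand { get; private set; } = -1;

    public void RawKeyDown(int key)
    {
        // Raw key-down from the game window, before command mapping.
    }

    public void RawKeyUp(int key)
    {
        // Raw key-up from the game window.
    }

    public void TouchEvent(int groupIndex, int commandIndex)
    {
        // groupIndex:   the panel group containing the touch element
        // commandIndex: numeric index of the bound command, if set
        LastGroup = groupIndex;
        LastCommand = commandIndex;
    }
}
```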
Thank you. It works well. But I don't know what value comes into commandIndex in TouchEvent. How do I set the commandIndex value?
The CommandIndex is the numeric index of the command (if set) in the following enum: https://github.com/leezer3/OpenBVE/blob/master/source/OpenBveApi/Interface/Input/Commands.cs#L8
If you want a touch zone to perform a specific virtual-keys command (as opposed to JumpScreen), the Command can be set using its textual name. https://github.com/leezer3/OpenBVE/blob/master/source/Plugins/Train.OpenBve/Panel/PanelXmlParser.cs#L506
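As a hedged example, binding a touch zone to a built-in command by its textual name might look roughly like this in panel.xml. The exact element layout is an assumption based on the parser linked above, and `PowerIncrease` is used only as a representative name from the Command enum; check PanelXmlParser.cs for the authoritative syntax:

```xml
<Touch>
  <!-- The command is resolved by its textual name from the Command
       enum; PowerIncrease is one representative name. Element layout
       here is illustrative, not authoritative. -->
  <Command>PowerIncrease</Command>
</Touch>
```

When this zone is touched, the plugin's TouchEvent would then receive the numeric index of that command within the enum as commandIndex.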