Mizer
[New entities] DMX and Fixture values
Thanks for the project! It's nearly exactly what I've been thinking of building. :)
Unrelated: Is the Rust part integrated with the Flutter interface via FFI, or does it run as a separate process? Please document what "entity" means in the docs; it isn't immediately obvious. Maybe "data type", "signal type", or "port type" would be better?
See also: #120
New entities
- DMX: The values of a single DMX512 universe
- Multi-universe DMX: DMX times any number of universes
- Fixture values: Per-fixture (defined in the Fixtures view) values like colors and blend mode overrides
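A rough sketch of how these entity types might look as Rust data types (all names here are illustrative, not Mizer's actual types):

```rust
use std::collections::HashMap;

/// The values of a single DMX512 universe: 512 channels of 0-255.
type Dmx = [u8; 512];

/// Multi-universe DMX: universe number -> channel values.
type MultiUniverseDmx = HashMap<u16, Dmx>;

/// Per-fixture values as defined in the Fixtures view (field names made up).
#[allow(dead_code)]
struct FixtureValues {
    fixture_id: u32,
    /// RGB color, each component 0.0..=1.0.
    color: (f64, f64, f64),
    /// Optional override of the fixture's blend mode.
    blend_mode_override: Option<String>,
}

fn main() {
    let mut dmx: MultiUniverseDmx = HashMap::new();
    dmx.insert(1, [0u8; 512]);
    dmx.get_mut(&1).unwrap()[0] = 255; // channel 1 of universe 1 at full
    assert_eq!(dmx[&1][0], 255);
}
```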
New nodes
Fixture dope sheet (#161)
Properties: Dope sheet data per fixture: colors, color blend mode override (can be from presets)
Inputs: Timecode
Outputs: Fixture values
DMX and Fixture values mixer
Properties: Number of inputs/layers and blend modes (e.g. last write wins, brightest wins, additive)
Inputs:
- DMX/Fixture values per-layer
- 0-1 float multiplier per-layer
Outputs: DMX/Fixture values
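The blend modes could work roughly like this per channel (a hedged sketch; `BlendMode` and `blend` are made-up names, not Mizer's API):

```rust
#[derive(Clone, Copy)]
enum BlendMode {
    /// Last write wins: the upper layer simply replaces the value.
    Ltp,
    /// Brightest wins (HTP): keep the higher of the two values.
    Htp,
    /// Additive: sum, clamped to the DMX channel range 0..=255.
    Additive,
}

/// Blend one channel of an upper layer onto the value accumulated so far,
/// scaling the upper layer by its 0.0..=1.0 multiplier first.
fn blend(mode: BlendMode, below: u8, above: u8, multiplier: f64) -> u8 {
    let above = (above as f64 * multiplier.clamp(0.0, 1.0)).round() as u16;
    match mode {
        BlendMode::Ltp => above as u8,
        BlendMode::Htp => below.max(above as u8),
        BlendMode::Additive => (below as u16 + above).min(255) as u8,
    }
}

fn main() {
    // HTP keeps the brighter value; additive clamps at full.
    assert_eq!(blend(BlendMode::Htp, 120, 80, 1.0), 120);
    assert_eq!(blend(BlendMode::Additive, 200, 100, 1.0), 255);
    // The per-layer multiplier scales the layer before blending.
    assert_eq!(blend(BlendMode::Ltp, 0, 200, 0.5), 100);
}
```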
DMX realization
Realizes Fixture values to DMX using the data defined in the Fixtures view.
Properties: Fixture set (?), universes/patch (?)
Inputs: Fixture values
Outputs: Multi-universe DMX
DMX output
Properties: DMX connection in Connections view
Inputs: Multi-universe DMX
Future development/ideas
I have no idea how to connect this to a more traditional UI like sequencers/cue lists and globally activatable scenes. They could have their own source node ("manual/traditional/global inputs") outputting Fixture values.
- Variable timings in dope sheets
- How to integrate sequencers/cue lists. Probably by making a switcher node from timecode or an integer...
First of all I'm glad you like this project :)
But I'll have to admit, I'm not quite sure what your feature request is. I'm also not sure where you've found the "Entity" name, as it's only used internally in the command line parser, which isn't referenced in the documentation.
If this is what you're referring to, I suppose you want to write DMX values directly from the command line?
For the new nodes, I think the dope sheet is kinda close to the existing timecode feature, but more powerful. The timecode feature is sadly not in the state I would like it to be in. I would like it to also support playback of audio and video, and to improve the editor so you don't have to record timecode values but can use a spline editor as well. But as I'm not using timecode a lot, these features got prioritized down.
The DMX and Fixture values mixer would be very hard to implement in the current codebase, as the node system currently has no deterministic execution order; the mixer would ignore some of the values written to it. This is why I was thinking about issue #120, but I'm currently not sure whether this is the right way to continue.
DMX Realization, on the other hand, sounds quite nice, but it would always be delayed by one "frame", as the DMX output is calculated at the end of each frame while the nodes are executed in the middle.
A DMX Output node already exists, but it only accepts a single channel for a single universe.
> But I'll have to admit, I'm not quite sure what your feature request is.
> If this is what you're referring to I suppose you want to directly write dmx values using the command line?
This feature request is about computing DMX and fixture values fully inside the node system. Basically a duplicate of #120, with some concrete definitions.
> I'm not quite sure where you've found the "Entity" name as it's only used internally in the command line parser which isn't referenced in the documentation.
From reading the source code and #161.
> For the new nodes, I think the dope sheet is kinda close to the existing timecode feature, but more powerful. The timecode feature is sadly not in a state where I would like it to be. I would like it to also support playback of audio and video as well as improve the editor so you don't have to record timecode values but can use a spline editor as well. But as I'm not using timecode a lot these features got prioritized down.
> The DMX and Fixture values mixer would be very hard to implement in the current codebase as the node system has currently no deterministic execution order. So the mixer would ignore some of the values that would be written to it. This is why I was thinking about the issue #120 but I'm currently not sure whether this is the right way to continue.
In my mind there are two evaluation strategies: pull and push. Pull is good when you have a specific output rate, say for audio and video. Push is good when you want reactive behavior like button clicks.
Example data flow graph:
```mermaid
flowchart LR
A(["Timecode source 1<br/>At every tick returns a monotonic timestamp"]) -->|"Timecode (continuous)"| B(Dope Sheet)
B -->|"Fixture data (discrete)"| C[DMX Realization]
C -->|"Multi-universe DMX (discrete)"| D([DMX Output])
```
Ideal node graph for pull evaluation strategy, and its execution order:
```mermaid
flowchart RL
C[DMX Realization] -->|"Fixture data (discrete)"| B
D([DMX Output]) -->|"Multi-universe DMX (discrete)"| C
B(Dope Sheet) -->|"Timecode (continuous)"| A(["Timecode source 1<br/>At every tick returns a monotonic timestamp"])
```
Ideal node graph for push evaluation strategy, and its execution order (very painful and inefficient if this were pull):
```mermaid
flowchart LR
A(["Button press/OSC message/MIDI note"]) -->|"Event/trigger (discrete)"| B{"Decision tree? Arithmetic?"}
B -->|"Event/trigger"| D([e.g. OSC output])
```
> DMX Realization on the other hand sounds quite nice but would always be delayed by one "frame" as the DMX output is calculated at the end of each frame while the nodes are executed in the middle.
I'm suggesting that the fixture-data-to-DMX calculation be moved into the DMX Realization node and calculated in sync with the rest of the node system. Though I haven't taken a closer look at the node execution engine yet.
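A minimal sketch of what such a realization step could look like, assuming a hypothetical `PatchEntry` describing where a fixture's channels are patched (none of these names come from Mizer):

```rust
use std::collections::HashMap;

/// Hypothetical patch entry: where a fixture's channels live.
struct PatchEntry {
    universe: u16,
    /// 0-based start address of the fixture's first channel.
    address: usize,
}

/// Realize per-fixture RGB values (0.0..=1.0) into raw DMX universes,
/// using a patch like the one defined in the Fixtures view.
fn realize(
    values: &HashMap<u32, (f64, f64, f64)>,
    patch: &HashMap<u32, PatchEntry>,
) -> HashMap<u16, [u8; 512]> {
    let mut out: HashMap<u16, [u8; 512]> = HashMap::new();
    for (fixture_id, (r, g, b)) in values {
        if let Some(entry) = patch.get(fixture_id) {
            let universe = out.entry(entry.universe).or_insert([0u8; 512]);
            let channels = [*r, *g, *b];
            for (offset, channel) in channels.iter().enumerate() {
                universe[entry.address + offset] =
                    (channel.clamp(0.0, 1.0) * 255.0).round() as u8;
            }
        }
    }
    out
}

fn main() {
    let values = HashMap::from([(7u32, (1.0, 0.5, 0.0))]);
    let patch = HashMap::from([(7u32, PatchEntry { universe: 1, address: 10 })]);
    let dmx = realize(&values, &patch);
    assert_eq!(dmx[&1][10], 255); // red at full
    assert_eq!(dmx[&1][11], 128); // green at half
    assert_eq!(dmx[&1][12], 0);   // blue off
}
```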
> A DMX Output node already exists but it only accepts a single channel for a single universe.
I didn't see that earlier. Here, I'll actually just give you my old notes:
Events/Triggers (finite, discrete)
- Button presses
- OSC messages
- MIDI notes
- Scene changes
Need to immediately recompute affected parts of the graph
Timecode/Clock (infinite, continuous)
- A monotonic timestamp
- Drives time-based transitions
- Doesn't change state, but is sampled regularly (like polling)
- Enables deterministic forward computation ("what will the DMX state be at t + 50ms")
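For example, deterministic forward computation falls out naturally if a track is a pure function of the timestamp (illustrative sketch, not Mizer code):

```rust
/// A keyframe: (time in seconds, value).
type Keyframe = (f64, f64);

/// Sample a keyframe track at time `t` with linear interpolation.
/// Because the result depends only on `t`, the same track can be
/// evaluated for "now" or for any future timestamp (e.g. t + 50ms).
fn sample(track: &[Keyframe], t: f64) -> f64 {
    match track.iter().position(|&(kt, _)| kt > t) {
        Some(0) => track[0].1,                     // before the first keyframe
        None => track.last().map_or(0.0, |k| k.1), // after the last keyframe
        Some(i) => {
            let (t0, v0) = track[i - 1];
            let (t1, v1) = track[i];
            v0 + (v1 - v0) * ((t - t0) / (t1 - t0))
        }
    }
}

fn main() {
    let track = [(0.0, 0.0), (1.0, 1.0), (3.0, 0.0)];
    assert!((sample(&track, 0.5) - 0.5).abs() < 1e-9);
    assert!((sample(&track, 2.0) - 0.5).abs() < 1e-9);
    // Forward computation: the value 50 ms from "now" is just sample(t + 0.05).
    assert!((sample(&track, 0.5 + 0.05) - 0.55).abs() < 1e-9);
}
```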
Push vs Pull: The Evaluation Strategy
Pull (lazy):
- Nodes only compute when asked
- Downstream node pulls input from upstream -> upstream evaluates only if needed
- Good for deterministic, efficient graph traversal
- Fits well with "scheduler ticks"
Push (eager):
- Nodes push updates downstream when their inputs change
- Reactive: Every change ripples through the graph
- Good for handling fast-changing event streams
- Harder to order and optimize
Execution
On input event (e.g. MIDI CC, OSC, UI click):
- Set the node's value
- Recurse downstream through all connected nodes that do not depend on time.
- For each of those nodes:
  - Immediately compute its new value
  - Push any outputs (e.g. update UI, send MIDI/OSC out)
On engine tick (e.g. 100Hz):
- Update the current timecode
- Apply debounced events (e.g. MIDI fader, external timecode), and process them as if they were normal input events (see above).
- Mark all time-dependent nodes as dirty.
- For each output node (e.g. DMX):
  - Compute its value using a pull strategy, which recursively evaluates any dirty upstream nodes.
  - During this process, each node clears its dirty flag after being computed.
  - Final output is flushed to its respective interface (e.g. sent to a DMX universe)
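The tick algorithm above could be sketched like this (a toy graph with made-up names; real nodes would compute more than a sum of their inputs):

```rust
use std::collections::HashMap;

struct Node {
    value: f64,
    dirty: bool,
    /// Names of upstream nodes this node depends on.
    inputs: Vec<String>,
}

/// Pull-evaluate `name`: recursively evaluate dirty upstream nodes,
/// combine their values (summing here, for simplicity), and clear the
/// dirty flag once the node has been recomputed.
fn evaluate(graph: &mut HashMap<String, Node>, name: &str) -> f64 {
    if !graph[name].dirty {
        return graph[name].value;
    }
    let inputs = graph[name].inputs.clone();
    let sum: f64 = inputs.iter().map(|i| evaluate(graph, i)).sum();
    let node = graph.get_mut(name).unwrap();
    if !node.inputs.is_empty() {
        node.value = sum; // source nodes keep their externally-set value
    }
    node.dirty = false;
    node.value
}

fn main() {
    let mut graph: HashMap<String, Node> = HashMap::new();
    graph.insert("timecode".into(), Node { value: 0.0, dirty: false, inputs: vec![] });
    graph.insert("dope_sheet".into(), Node { value: 0.0, dirty: false, inputs: vec!["timecode".into()] });
    graph.insert("dmx_out".into(), Node { value: 0.0, dirty: false, inputs: vec!["dope_sheet".into()] });

    // One engine tick: update the timecode, mark nodes dirty,
    // then pull from the output node.
    graph.get_mut("timecode").unwrap().value = 1.5;
    for node in graph.values_mut() {
        node.dirty = true;
    }
    assert_eq!(evaluate(&mut graph, "dmx_out"), 1.5);
    assert!(!graph["dmx_out"].dirty); // flags are cleared during the pull
}
```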
I might want to throw audio and video into the continuous box too, because I probably don't want to recompute video between frames when I move a slider. Or this could be detected automatically, e.g. by checking whether the video mixer node leads to any push-triggered outputs.
^^^ An addition to the above comment:
GStreamer supports both push and pull modes, chooses them automatically (e.g. is source node live, is sink node clocked externally, etc.) and makes buffers and stuff automatically when push and pull subgraphs are connected. IIRC it also supports different tick rates between subgraphs.
Okay, I understand your request now. The execution model of Mizer is different though, and currently I would like it to stay this way. In general my philosophy has changed a bit: I want to move features out of the node pipeline into prebuilt blocks that can be enhanced by nodes, so it will be easier to build a show, as it currently feels very tedious. Having to manually build the pipeline that writes fixture values to the DMX outputs goes against this goal.
Ideally all the tools which help you in the programming workflow would be an addon/plugin/module that sits on top of a very flexible node graph, but as I already have major scope creep problems with this project, I think this won't happen for now.
The pull vs push evaluation though is something I hadn't really thought about in this iteration. In an earlier version of this software (without the node graph), I actually had everything in push mode, as it was entirely event driven: the push of a button, a clock tick, or a fade in a sequence all emitted events. This was nice for response time and easy to build initially, but it was very hard to make the application reliable and to debug edge cases, as the execution order was always different.
But as optimization of the node graph is still on my roadmap, I may have to re-evaluate this.