
Piano: Transform Piano into a DAW/Tracker

Open · kleinesfilmroellchen opened this issue 3 years ago · 6 comments

Make Piano into a DAW; a.k.a.: Kleines Filmröllchen's Serenity Masterplan

The goal for Piano, as discussed with @willmcpherson2 and other people, is to transform it into a DAW and tracker program similar in mode of operation to programs like Cubase or Ableton.

We want multiple channels; each channel may contain note tracks or raw audio tracks (and multiple of each). Raw audio can be loaded from a file, recorded from audio input devices, or dubbed from other tracks. A similar thing goes for notes: load MIDI files, live-record from the keyboard or from MIDI devices (which will require MIDI drivers), and export to MIDI.

To create audio from notes and to modify existing audio, there should be a processing chain consisting of any number of processors. A processor takes notes or audio as input and outputs notes or audio. The most important kinds of processors that we will port from the existing Piano functionality are synthesizers, which create sound from notes, and effects/filters, which process audio to e.g. add delay or reverb, EQ, or band-pass the signal. Through this flexible system, just a handful of small processors may be combined to produce a huge variety of sounds, and anybody can easily implement their favorite synth or filter.
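To make the effect/filter idea concrete, here is a minimal sketch of one such audio processor: a feedback delay that works one sample at a time. The class name and API are hypothetical, not the actual LibDSP interface.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch of a simple audio effect: a feedback delay line.
class DelayEffect {
public:
    DelayEffect(size_t delay_samples, float feedback)
        : m_buffer(delay_samples, 0.0f)
        , m_feedback(feedback)
    {
    }

    // Processes one sample: the output is the dry input plus the delayed
    // signal; the feedback factor controls how quickly echoes decay.
    float process(float input)
    {
        float delayed = m_buffer[m_position];
        float output = input + delayed;
        m_buffer[m_position] = input + delayed * m_feedback;
        m_position = (m_position + 1) % m_buffer.size();
        return output;
    }

private:
    std::vector<float> m_buffer; // ring buffer holding the delayed signal
    float m_feedback;
    size_t m_position { 0 };
};
```

Feeding an impulse through this produces the dry impulse immediately, then an echo every `delay_samples` samples, each scaled by the feedback factor.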

Additionally, turning Piano into a proper application will require proper project loading and saving, and the ability to copy/paste basically anything.

This is a very wide scope that will require a lot of work. ~~I won't put a to-do list here because~~ this is very much a draft of ideas and the scope may change as we move along. Here's a plan of sorts, though other things are mixed in.

Finished work

  • Start LibDSP (#9638)
  • Transport controls (#8634)
  • Move synth engine & mastering into LibDSP (#10167 (atomicised, therefore closed), #10726)
  • Make audio real-time capable (#12102)
  • Make samples 32-bit float (#13938)
  • Refactorings from when I started all of this (#14010) and the move to the significantly more complex sample range APIs
  • Move Keyboard into LibDSP (#14032)
  • Move Track into LibDSP (#14424)
  • Make volume work again + major UI code cleanup (#14664)

Ongoing work

(your PR here!)

Planned next steps

  • Move TrackManager into LibDSP
  • Remove Piano's Music.h almost entirely, except for UI-specific code (such as key & waveform colors)
  • Rewrite AudioPlayerLoop as real-time code & enforce allocation guards

kleinesfilmroellchen avatar Apr 21 '21 09:04 kleinesfilmroellchen

One basic design idea is to have a Processor virtual class with some virtual methods like:

```
Audio thru(Audio in)
MIDI thru(MIDI in)
Audio thru(MIDI in)
MIDI thru(Audio in)
```

In other words, a processor object will implement a thru method with input as either Audio or MIDI and output as either Audio or MIDI. Note that Audio will probably be a sample and MIDI will probably be the current notes. Here are some examples of processor instances:

  • Reverb is Audio thru(Audio in)
  • An arpeggiator is MIDI thru(MIDI in)
  • Autotune could be MIDI thru(Audio in) (this combination is the rarest)
  • A synthesiser is Audio thru(MIDI in)

Then there would be some sort of Chain type, or just a Vector&lt;Processor&gt;. There would be some runtime check that a signal can actually be threaded through the thru methods. For example, Ableton Live simply prevents you from inserting an incompatible processor.

Processor could also have a draw method etc.

willmcpherson2 avatar Apr 21 '21 10:04 willmcpherson2

Two main things that are outside the scope of this project would help us:

  • [ ] MIDI drivers and MIDI subsystem (Kernel). This will allow us to capture and record input from actual MIDI devices, and even send signals back. I'm not referring to the MIDI file format here.
  • [ ] KnobWidget (LibGUI). A lot of processor dials would benefit from being an actual skeuomorphic knob instead of the current slider. Essentially, a KnobWidget is just a visually distinct slider widget with the ability to add "guide notches" (e.g. the center point for pan knobs, the zero point for EQ knobs) or even tiny text labels (volume 1-10 like on amps, pan L/C/R, effect Dry/Wet).

kleinesfilmroellchen avatar Apr 21 '21 11:04 kleinesfilmroellchen

My 2 cents:

A Processor should also expose some methods to modify its internal params - to twist its knobs programmatically. This will enable automation and MIDI CC integration down the line. Now, these params should also be represented as objects wrapping a float (or whatever) because they can carry metadata such as min and max values. Also, there are at least two types of them: audio-rate params, which can vary from sample to sample, and control-rate params, which don't have to vary 44k times a second. The DSP code inside a Processor should be able to distinguish between those two for performance reasons.
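A minimal sketch of such a parameter object, with the rate distinction modeled as a simple flag (a real design might use separate types so the DSP loop can dispatch statically). All names are hypothetical:

```cpp
#include <algorithm>

// Hypothetical sketch of a parameter object carrying metadata.
enum class ParameterRate {
    Control, // updated at block rate (UI, automation lanes)
    Audio,   // may change on every single sample
};

class Parameter {
public:
    Parameter(float min, float max, float initial, ParameterRate rate)
        : m_min(min)
        , m_max(max)
        , m_value(std::clamp(initial, min, max))
        , m_rate(rate)
    {
    }

    // Setting the value clamps to the declared [min, max] range, so a
    // knob or automation curve can never push a processor into an
    // illegal state.
    void set_value(float value) { m_value = std::clamp(value, m_min, m_max); }

    float value() const { return m_value; }
    float min() const { return m_min; }
    float max() const { return m_max; }
    ParameterRate rate() const { return m_rate; }

private:
    float m_min;
    float m_max;
    float m_value;
    ParameterRate m_rate;
};
```

A filter cutoff, for instance, could be declared as `Parameter cutoff(20.0f, 20000.0f, 1000.0f, ParameterRate::Control);` and any out-of-range automation value would be clamped on write.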

This system could even be moved into its own lib, e.g. LibDSP or LibSynth, because it could be useful for other apps: media players, games, Browser's WebAudio support, etc. We'd have our own PD/Max analog.

Plus, we'd have a Track that has:

  • a Processor chain
  • list of Clips to be played at a given time
  • input/output gain & pan + other track related state

Track could be specialized into MIDITrack and AudioTrack, holding either MIDIClips or AudioClips. I think MIDI is enough for now.

It might be useful to have a special Track for master (or main) that is a return bus and doesn't have any Clips. This could also be used for sends later on. It would require either multiple Audio inputs on a track or, better, the Audio type being able to sum any number of signals plugged into it.

We'd also need a separate, global Transport object that does the playback control and timekeeping. It would know the BPM and measure, track the playheads across all the clips at all times, and provide the time to all processors and the audio output. In general, this is more complex than it may sound, so it's good to keep this logic contained from the start.
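The core of that timekeeping is deriving musical time from the sample position, since the sample counter is the only clock the audio thread actually has. A hypothetical sketch (names and the fixed beats-per-measure parameter are illustrative):

```cpp
#include <cstddef>

// Hypothetical sketch of the Transport's core conversion:
// sample position -> musical time, given tempo and sample rate.
struct MusicalTime {
    int measure; // 0-based measure index
    double beat; // fractional beat within the measure
};

MusicalTime sample_to_musical_time(size_t sample_position, double sample_rate,
    double bpm, int beats_per_measure = 4)
{
    // seconds elapsed = samples / rate; beats elapsed = seconds * bpm / 60
    double total_beats = static_cast<double>(sample_position) / sample_rate * bpm / 60.0;
    int measure = static_cast<int>(total_beats) / beats_per_measure;
    double beat = total_beats - measure * beats_per_measure;
    return { measure, beat };
}
```

For example, at 44100 Hz and 120 BPM, sample 88200 is exactly two seconds in, i.e. four beats: the start of measure 1 (0-based) in 4/4. Keeping this conversion in one place means tempo changes later only have to touch the Transport.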

About MIDI, I think that Piano can be developed without any real MIDI drivers or support from the OS, but the music model should at least resemble the MIDI spec a little bit. This will make reading and writing MIDI files easy and, if done correctly, should enable easy integration with the future MIDI subsystem if there is one.

nooga avatar Apr 21 '21 12:04 nooga

For completeness' sake, kling suggested a node-based FX system many years ago...

https://freenode.logbot.info/serenityos/20191206#c2938796

https://freenode.logbot.info/serenityos/20200210#c3219213

Simply converting Piano to use a proper object model where processors can be chained would be a major improvement to the application. In fact, I would suggest that the processor chain should be the only task within the scope of this issue. I agree with essentially everything in the thread currently, but let's not get too designy.

Also +1 for LibDSP, that just seems very reasonable.

willmcpherson2 avatar Apr 21 '21 15:04 willmcpherson2

WIP: https://github.com/kleinesfilmroellchen/serenity

willmcpherson2 avatar Apr 26 '21 08:04 willmcpherson2

#6258

Thiri25 avatar Sep 07 '21 00:09 Thiri25

I will close this since it doesn't provide much benefit to us as a tracking issue. I will continue to transform Piano into a DAW, of course :^)

kleinesfilmroellchen avatar Feb 27 '23 15:02 kleinesfilmroellchen