
[FEATURE] Sequential model

Open · sdatkinson opened this issue 8 months ago · 1 comment

TODO checklist

  • [ ] Factory to initialize from .nam file.
  • [ ] Document .nam file version 0.5.5 to include "sequential".
    • [ ] No "weights" or "sample_rate" fields at the top level--these belong to the individual layers (and the layers' sample rates are asserted to be mutually compatible). See the sketch below.
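
As a rough illustration of what such a file might look like (the "models" nesting under "config" and the sub-model fields shown here are assumptions for discussion, not a finalized spec; sub-model contents are abbreviated):

```json
{
  "version": "0.5.5",
  "architecture": "Sequential",
  "config": {
    "models": [
      { "architecture": "WaveNet", "config": {}, "weights": [], "sample_rate": 48000 },
      { "architecture": "Linear", "config": {}, "weights": [], "sample_rate": 48000 }
    ]
  },
  "metadata": {}
}
```

Note how "weights" and "sample_rate" live inside each sub-model rather than at the top level, per the checklist above.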

Description

Motivation: Pocket Master can't run an IR at the same time as a NAM, so full-rig NAMs are needed. Unfortunately, if you have a NAM that you usually use in series with an IR, you'd have to combine them somehow. Re-training is one option, but I don't like that (1) it's lossy, and (2) it's a waste of GPU time.

Solution: Define a new kind of .nam that's a serial concatenation of other NAMs.

E.g. one could combine a NAM of an amp with a linear NAM (aka an IR) of a cab to get a full rig NAM.

Obviously this doesn't change the weights of either model, magically make the resulting NAM take less CPU to run, or make it usable on hardware that only supports the model architectures as of v0.2.0. It's just a convenience wrapper that might make it easier to get full-rig NAMs and, more generally, to build models that combine multiple sub-models.
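
As a sketch of the kind of convenience wrapper this implies: the `Model` interface below is a hypothetical stand-in for the real `nam::DSP` base class (whose exact signatures may differ); only the chaining logic is the point.

```cpp
#include <algorithm>
#include <cstddef>
#include <memory>
#include <utility>
#include <vector>

// Hypothetical stand-in for the real nam::DSP interface; the actual base
// class in NeuralAmpModelerCore may look different.
class Model
{
public:
  virtual ~Model() = default;
  // Mono, block-based processing: read numFrames samples from input and
  // write numFrames samples to output.
  virtual void process(const float* input, float* output, size_t numFrames) = 0;
};

// "Sequential": a convenience wrapper that runs its sub-models back to back,
// feeding each model's output into the next model's input.
class Sequential : public Model
{
public:
  explicit Sequential(std::vector<std::unique_ptr<Model>> models)
  : mModels(std::move(models))
  {
  }

  void process(const float* input, float* output, size_t numFrames) override
  {
    if (mModels.empty())
    {
      // Degenerate case: behave as a pass-through.
      std::copy(input, input + numFrames, output);
      return;
    }
    // Ping-pong between two scratch buffers so no sub-model has to process
    // in place; the last sub-model writes straight to the caller's buffer.
    mScratchA.resize(numFrames);
    mScratchB.resize(numFrames);
    float* scratch[2] = {mScratchA.data(), mScratchB.data()};
    const float* src = input;
    for (size_t i = 0; i < mModels.size(); i++)
    {
      float* dst = (i + 1 == mModels.size()) ? output : scratch[i % 2];
      mModels[i]->process(src, dst, numFrames);
      src = dst;
    }
  }

private:
  std::vector<std::unique_ptr<Model>> mModels;
  std::vector<float> mScratchA;
  std::vector<float> mScratchB;
};
```

The sample-rate compatibility assertion from the checklist would naturally live in the factory that parses the sub-model configs and builds this wrapper.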

Named "sequential" in reference to torch.nn.Sequential, which does a similar compositional thing.

sdatkinson · Apr 21 '25

I would really prefer to just do this by implementing a general "(directed, acyclic) graph NAM" compositional model. However, I'm a bit wary about doing this without thinking through how to handle "fan-out"/"fan-in" topologies more carefully.

Perhaps "fan-out" can assume that the output is simply copied to each; and fan-in can define some "aggregation" options--mean and sum are the most obviously-useful, but others could be done and I don't want to rule those out. A "multi-input, multi-output" model definition might make more sense (#148).

But perhaps this can be the first step; then it wouldn't be too much work to maintain some "converter" from any sequential model to its DAG representation in the future.

sdatkinson · Jul 27 '25