ProjectBabble
(feat) Create per model/avatar/user calibration
Currently, the Babble App stores only one calibration. This could be enhanced by storing calibrations per user AND per avatar, with a default/fallback config for each user.
These could then be loaded manually OR automatically when a user's avatar changes (say, in VRChat via OSC). This could be represented by a tree dropdown selector, and be enabled/disabled like so:
...
- [ ] Automatically apply per-avatar configuration
- User 1
  - Avatar A
  - Avatar B
  - ...
  - Default
- User 2
  - Avatar A
  - Avatar B
  - ...
  - Default
...
With options to edit/delete entries as required.
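The lookup described above (avatar-specific entry first, then the user's default) could be sketched roughly like this. All names here (`CalibrationStore`, etc.) are illustrative only, not existing Babble App APIs:

```python
# Hypothetical sketch: calibrations keyed by (user, avatar), where
# avatar=None holds the user's default/fallback calibration.
class CalibrationStore:
    def __init__(self):
        self._store = {}  # {(user, avatar): calibration dict}

    def save(self, user, calibration, avatar=None):
        self._store[(user, avatar)] = calibration

    def load(self, user, avatar=None):
        # Prefer the avatar-specific entry, fall back to the user default.
        cal = self._store.get((user, avatar))
        if cal is None:
            cal = self._store.get((user, None))
        return cal

    def delete(self, user, avatar=None):
        self._store.pop((user, avatar), None)


store = CalibrationStore()
store.save("User 1", {"jawOpen": 1.0})                    # per-user default
store.save("User 1", {"jawOpen": 0.8}, avatar="Avatar A")  # per-avatar entry
store.load("User 1", avatar="Avatar A")  # avatar-specific calibration
store.load("User 1", avatar="Avatar B")  # no entry -> falls back to default
```

The `load` call would be triggered either by the manual dropdown selection or by an avatar-change event received over OSC.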
Perhaps we could keep a "global" per-model calibration like the one we have now, and add profile-selectable "modifiers" that are applied on top of it. For example, a second normalization pass could "boost" the output of a specific shape, or a curve could be applied for nonlinear shape activations.
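The modifier idea could look something like the sketch below: a per-shape boost multiplier (the "second normalization") and a per-shape exponent (a simple gamma-style curve for nonlinear activation). The function name and parameters are assumptions for illustration, not part of the current app:

```python
# Hypothetical sketch: modifiers layered on top of the global calibration.
def apply_modifiers(shapes, boosts=None, curves=None):
    """shapes: {shape_name: value in [0, 1]} after global calibration."""
    boosts = boosts or {}
    curves = curves or {}
    out = {}
    for name, value in shapes.items():
        v = min(1.0, value * boosts.get(name, 1.0))  # boost, clamped to 1.0
        v = v ** curves.get(name, 1.0)               # nonlinear curve (gamma)
        out[name] = v
    return out


calibrated = {"jawOpen": 0.4, "mouthFunnel": 0.5}
profile = apply_modifiers(
    calibrated,
    boosts={"jawOpen": 2.0},       # boost a shape that under-activates
    curves={"mouthFunnel": 2.0},   # ease-in curve: suppress low values
)
```

Since modifiers are pure transforms on the globally calibrated output, each per-avatar profile could store only its modifier set rather than a full calibration.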