Denoiser
Determine Batch Size
Right now the default batch size is 4, with isMobile devices always defaulting to 1.
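A minimal sketch of that current behavior (the helper name and constant are illustrative, not the actual API):

```ts
// Illustrative sketch of the current defaults; names are assumptions, not project code.
const DEFAULT_BATCH_SIZE = 4;

function determineBatchSize(isMobile: boolean): number {
  // Mobile devices fall back to a batch size of 1 to stay within tighter memory limits.
  return isMobile ? 1 : DEFAULT_BATCH_SIZE;
}
```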
This is a massive simplification; we should work out what the batch size really needs to be and test at what level it matters. TensorFlow starts throwing warnings once I hit about 1.7 GB of tensor memory (which is huge), and I'm not sure what will happen when I try the same with WebGPU.
A reasonable goal would be to run just below the point where TensorFlow starts complaining, but getting there will take some testing across image size, aux inputs, quality, and batch size; see the rough sizing sketch below.
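One way to frame that testing is a rough budget check before picking a batch size. This sketch assumes a budget a little under the observed 1.7 GB warning point and only estimates input bytes; UNet activations add more on top, so treat it as optimistic:

```ts
import * as tf from '@tensorflow/tfjs';

// Assumed budget, set just under the ~1.7 GB point where tf.js started warning.
const MEMORY_BUDGET_BYTES = 1.5 * 1024 ** 3;

// Very rough check: estimated float32 input bytes for the batch plus whatever the
// backend already holds. Intermediate UNet tensors are not counted here.
function batchLikelyFits(height: number, width: number, channels: number, batchSize: number): boolean {
  const bytesPerFloat32 = 4;
  const inputBytes = height * width * channels * batchSize * bytesPerFloat32;
  return tf.memory().numBytes + inputBytes < MEMORY_BUDGET_BYTES;
}
```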
Setting the batch size to 6 or 8 would significantly speed up processing, but it causes momentary memory spikes while the model runs, since the tensors are kept alive throughout the UNet pass.
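One way to keep those spikes from stacking across consecutive batches, assuming the model is a tf.LayersModel, is to release each batch's intermediates with tf.tidy (a sketch, not the project's actual code):

```ts
import * as tf from '@tensorflow/tfjs';

// Intermediate UNet tensors are disposed as soon as the batch finishes, so peaks from
// successive batches don't accumulate; the spike within one forward pass still remains.
function denoiseBatch(model: tf.LayersModel, batch: tf.Tensor4D): tf.Tensor4D {
  return tf.tidy(() => model.predict(batch) as tf.Tensor4D);
}
```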