Best way to implement feedback buffer
First off, thanks for creating ComputeSharp and making it available!
I've been messing around with the SwapChainApplication examples, and I'd like to implement something like this ShaderToy reaction-diffusion shader:
https://www.shadertoy.com/view/XsG3z1
Any suggestions on the best way to modify the SwapChainApplication framework to allow for this type of shader? I'm assuming I'd need another ReadWriteTexture2D buffer the size of the viewport? Or two: read from one, write to the other, and swap them each frame?
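Something like this is roughly what I'm picturing for the ping-pong version - just a sketch to show the idea, not tested; `ReactionDiffusionStep` is a made-up shader name, and I'm guessing at the exact allocation API and type names:

```csharp
// Two viewport-sized buffers allocated up front (API/type names approximate).
ReadWriteTexture2D<float4> bufferA = Gpu.Default.AllocateReadWriteTexture2D<float4>(width, height);
ReadWriteTexture2D<float4> bufferB = Gpu.Default.AllocateReadWriteTexture2D<float4>(width, height);

// Each frame: read last frame's state from bufferA, write the new state to bufferB...
Gpu.Default.For(width, height, new ReactionDiffusionStep(bufferA, bufferB));

// ...then swap the references so the next frame reads what was just written.
(bufferA, bufferB) = (bufferB, bufferA);
```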
Thanks!
Just tried to do this in the WinUI 3 branch with the IShaderRunner: I create a ReadWriteTexture2D to copy each frame's result into, and also pass it to the shader so the next frame can read from it...
Ran into some interface vs. concrete type issues, though, with being able to easily copy the texture back into the buffer. I think an extra ComputeShader could do the copy, but that's more work... 🤣
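For reference, this is roughly what I mean by the extra ComputeShader doing the copy - just a sketch, using concrete texture types for the fields since I think part of my friction is that the runner hands me an interface (type names approximate for this branch):

```csharp
// Hypothetical texel-by-texel copy shader.
public readonly partial struct BufferCopier : IComputeShader
{
    private readonly ReadWriteTexture2D<float4> source;      // current frame's result
    private readonly ReadWriteTexture2D<float4> destination; // feedback/back buffer

    public BufferCopier(ReadWriteTexture2D<float4> source, ReadWriteTexture2D<float4> destination)
    {
        this.source = source;
        this.destination = destination;
    }

    public void Execute()
    {
        // Copy the current texel straight across; assumes both textures are the same size.
        destination[ThreadIds.X, ThreadIds.Y] = source[ThreadIds.X, ThreadIds.Y];
    }
}
```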
I know @Sergio0694 is working on aligning all his branches, so hopefully this feedback can help shape a shader runner that can be shared across all the runtimes or something... 🤷‍♂️
I'll keep tinkering and share if I get anything working.
Ah, figured I'd at least try the ComputeShader approach, but I feel like I'm still missing something about how those work in this context... almost got something, though:

It didn't cover the whole image, and it shouldn't be moving pixels down, though...
Ah, also thinking I'm running into issues with the texture size changing dynamically in the sample app and how to handle that with the back buffer, since I only allocate its size on the first frame... Not sure how to handle that at all in this scenario...
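Wondering if a check at the start of each frame would be enough for the resize case - a sketch of what I mean, assuming the texture the runner passes in exposes Width/Height, and guessing at the allocation API again:

```csharp
// If the frame texture changed size (dynamic resolution), drop the old back
// buffer and allocate a new one to match (sketch; API names approximate).
if (backBuffer is null || backBuffer.Width != texture.Width || backBuffer.Height != texture.Height)
{
    backBuffer?.Dispose();
    backBuffer = Gpu.Default.AllocateReadWriteTexture2D<float4>(texture.Width, texture.Height);
}
```

Though that throws away whatever state was in the old buffer every time it resizes...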
My broken experiments are here: https://github.com/hawkerm/ComputeSharp/tree/feature/winui3-extra-samples
```csharp
// Copy result into back buffer.
backBuffer.CopyFrom(texture); // Want
Gpu.Default.For(texture.Width, texture.Height, new BufferCopier(texture, backBuffer)); // Used
```
It would be nice to be able to just copy between the buffers with the IShaderRunner setup, but that doesn't account for the texture changing sizes with dynamic resolution - at least not with how the copy methods work today, as I think they expect same-size textures...
I suppose we could use UV coordinates for the buffer in the shader, re-allocate the back buffer whenever the texture size != back buffer size, and then use the ComputeShader to map UVs from the current frame's texture when writing back to the back buffer???
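To make that concrete, here's a rough sketch of the UV-mapped copy - `ScaledBufferCopier` is just a placeholder name, the scale factors get computed on the CPU side, and it's plain nearest-neighbor with no filtering:

```csharp
// Hypothetical variant of the copy shader that rescales coordinates so the
// frame texture and the back buffer don't need to be the same size.
public readonly partial struct ScaledBufferCopier : IComputeShader
{
    private readonly ReadWriteTexture2D<float4> source;      // current frame texture
    private readonly ReadWriteTexture2D<float4> destination; // back buffer
    private readonly float scaleX;                           // source.Width / (float)destination.Width
    private readonly float scaleY;                           // source.Height / (float)destination.Height

    public ScaledBufferCopier(ReadWriteTexture2D<float4> source, ReadWriteTexture2D<float4> destination)
    {
        this.source = source;
        this.destination = destination;
        this.scaleX = source.Width / (float)destination.Width;
        this.scaleY = source.Height / (float)destination.Height;
    }

    // Dispatched over the destination (back buffer) size.
    public void Execute()
    {
        // Map the destination texel back into the source's pixel space
        // (nearest-neighbor, no filtering).
        int sourceX = (int)(ThreadIds.X * scaleX);
        int sourceY = (int)(ThreadIds.Y * scaleY);

        destination[ThreadIds.X, ThreadIds.Y] = source[sourceX, sourceY];
    }
}
```

It would then get dispatched over the back buffer's size rather than the frame texture's, something like `Gpu.Default.For(backBuffer.Width, backBuffer.Height, copier)`.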
Well, I tried out my brainstorm... and it kind of worked:

Not sure why it's constantly falling and pushing pixels down, as I haven't tried to program movement yet... also, anything reaching the top is causing permanent streaks... (Edit: Ah, I think the streaks are a side-effect of the bad decay equation I currently have as a placeholder - since it's pulling down, if the top pixel is filled it'll constantly perpetuate that value.)