WebGPU rendering backend
This would be interesting. I have no idea how to use WebGPU, so if you're reading this, know it, and want to help, please reply.
I know some WebGPU (I did some stuff with it in Rust), plus some Vulkan and graphics programming in general. This would be super interesting indeed.
I'm not sure how other browsers do their rendering, though. Do they have a fragment shader for CSS's linear-gradient or something? Do they just render quads with textures generated on the CPU? Compute shaders? No idea; a rough guess at the fragment-shader approach is sketched below.
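For what it's worth, here is a rough guess at what a "fragment shader for linear-gradient" could look like, written as WGSL inside a TypeScript string. The gradient axis and the two color stops are hard-coded, and the whole thing is just a sketch of the idea, not how any browser actually implements it.

```ts
// Hypothetical WGSL: draw a quad and compute the gradient per fragment.
const gradientShader = /* wgsl */ `
  struct VsOut {
    @builtin(position) pos: vec4f,
    @location(0) uv: vec2f
  }

  @vertex
  fn vs(@builtin(vertex_index) i: u32) -> VsOut {
    // Two triangles covering the target rect, UVs in [0, 1].
    var corners = array<vec2f, 6>(
      vec2f(-1, -1), vec2f(1, -1), vec2f(-1, 1),
      vec2f(-1, 1), vec2f(1, -1), vec2f(1, 1)
    );
    var output: VsOut;
    output.pos = vec4f(corners[i], 0.0, 1.0);
    output.uv = corners[i] * 0.5 + vec2f(0.5);
    return output;
  }

  @fragment
  fn fs(frag: VsOut) -> @location(0) vec4f {
    // Project the fragment onto the gradient axis and mix the two stops.
    let axis = normalize(vec2f(1.0, 0.0));      // "to right"
    let t = clamp(dot(frag.uv, axis), 0.0, 1.0);
    let color0 = vec4f(1.0, 0.0, 0.0, 1.0);     // red
    let color1 = vec4f(0.0, 0.0, 1.0, 1.0);     // blue
    return mix(color0, color1, t);
  }
`;
```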
Maybe someone more knowledgeable about this than I am can reply and clarify. :)
Interesting post about rendering UIs: https://zed.dev/blog/videogame
We could basically reimplement the Canvas 2D API on top of WebGPU (would there be any gain from this?), or create our own API built around a render queue, etc.? A rough sketch of the render-queue idea is below.
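To make the render-queue idea a bit more concrete, here is a hypothetical TypeScript sketch. All the names (`RenderQueue`, `fillRect`, `quadVertices`) are made up, and it assumes `@webgpu/types` plus an already-created device, canvas context, and pipeline whose vertex layout is position + color. Paint code records primitives into the queue, and once per frame the queue is flushed into a single render pass. A real design would obviously also need batching, text, images, clipping, and so on.

```ts
// Hypothetical sketch: record draw commands now, flush them to the GPU later.
type RectCommand = {
  x: number; y: number; w: number; h: number;
  color: [number, number, number, number];
};

class RenderQueue {
  private commands: RectCommand[] = [];

  // Layout/paint code pushes primitives; nothing touches the GPU yet.
  fillRect(cmd: RectCommand) {
    this.commands.push(cmd);
  }

  // Once per frame: turn the queued primitives into vertex data and one render pass.
  flush(device: GPUDevice, context: GPUCanvasContext, pipeline: GPURenderPipeline) {
    const vertices = new Float32Array(this.commands.flatMap(quadVertices));
    const buffer = device.createBuffer({
      size: vertices.byteLength,
      usage: GPUBufferUsage.VERTEX | GPUBufferUsage.COPY_DST,
    });
    device.queue.writeBuffer(buffer, 0, vertices);

    const encoder = device.createCommandEncoder();
    const pass = encoder.beginRenderPass({
      colorAttachments: [{
        view: context.getCurrentTexture().createView(),
        loadOp: "clear",
        clearValue: { r: 1, g: 1, b: 1, a: 1 },
        storeOp: "store",
      }],
    });
    pass.setPipeline(pipeline);
    pass.setVertexBuffer(0, buffer);
    pass.draw(this.commands.length * 6); // two triangles per rect
    pass.end();
    device.queue.submit([encoder.finish()]);
    this.commands = [];
  }
}

// Expand one rect into 6 vertices of (x, y, r, g, b, a), assuming clip-space coords.
function quadVertices({ x, y, w, h, color }: RectCommand): number[] {
  const [r, g, b, a] = color;
  const corners = [
    [x, y], [x + w, y], [x, y + h],
    [x, y + h], [x + w, y], [x + w, y + h],
  ];
  return corners.flatMap(([vx, vy]) => [vx, vy, r, g, b, a]);
}
```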
I don't think reimplementing the Canvas API would give any gains; it might even be slower.
A completely custom renderer is the way to go, I think. I'll probably take a look at how Firefox does it today.
Servo's WebRender looks promising, although it's using OpenGL instead of wgpu; that shouldn't be much of a problem. I'm going to try to understand how it works over the next couple of days.