smithay
WGPU backend support
Hi everyone! What are your thoughts on implementing a WGPU graphics backend? This seems related to #129 and #134. I'm wondering if there would be any benefit to using WGPU over gfx-hal, as WGPU has a (mostly) safe API.
I've been experimenting on my own with WGPU and DRM and have gotten EGL rendering working, but I have yet to implement Vulkan.
Let me know what you think!
> Hi everyone! What are your thoughts on implementing a WGPU graphics backend? This seems related to #129 and #134. I'm wondering if there would be any benefit to using WGPU over gfx-hal, as WGPU has a (mostly) safe API.
wgpu is indeed a safer API and hides a lot of nasty things like memory barriers. That is a plus over gfx-hal.
However, there are parts of a Wayland compositor where wgpu arguably does not go low-level enough to give us the access we need.
Here are some things wgpu does not address for a Wayland compositor (this list is not exhaustive):
- It lacks the APIs for colorspace management, à la the color representation protocol proposal (https://gitlab.freedesktop.org/wayland/wayland-protocols/-/merge_requests/183).
- It cannot support YUV buffers (this might actually change, but it will only work for importing YUV-formatted images: https://github.com/gfx-rs/wgpu/issues/3145).
- wgpu does not natively support many features with the high-level API (you'll need to do many of these things manually via the wgpu-hal API), including but not limited to:
  - Importing a texture from a dmabuf.
  - wgpu cannot allocate its own buffers. This means the `Offscreen` trait is off limits.
  - wgpu does not allocate its fences (or whatever synchronization it uses internally) in a way that can be exported as a sync fd (this is planned for the existing renderers).
- Vulkan requires the needed extensions to be declared when creating an Instance and Device. This means you need to use wgpu-hal to initialize Vulkan (see the sketch after this list).
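For reference, here is a minimal sketch of what declaring such extensions up front looks like with raw Vulkan via the ash crate (builder-style API from ash ~0.37). This is not Smithay code; the dmabuf-import extensions are just one example, and physical-device/queue selection is hard-coded for brevity:

```rust
// Minimal sketch: the extensions a compositor needs (here, dmabuf import) must
// be listed when the VkInstance/VkDevice are created. wgpu's high-level API
// performs this step internally without letting you add to the extension list,
// which is why wgpu-hal is required for this.
use ash::vk;

fn create_device_with_dmabuf_import() -> (ash::Instance, ash::Device) {
    // Requires ash's "linked" feature; otherwise load the loader at runtime.
    let entry = ash::Entry::linked();

    let app_info = vk::ApplicationInfo::builder().api_version(vk::API_VERSION_1_1);
    let instance_info = vk::InstanceCreateInfo::builder().application_info(&app_info);
    let instance = unsafe { entry.create_instance(&instance_info, None) }
        .expect("vkCreateInstance failed");

    // Grab the first physical device purely for brevity.
    let physical = unsafe { instance.enumerate_physical_devices() }.expect("no devices")[0];

    // Device extensions needed to import dmabufs as VkImages.
    let extensions = [
        vk::KhrExternalMemoryFdFn::name().as_ptr(),
        vk::ExtExternalMemoryDmaBufFn::name().as_ptr(),
        vk::ExtImageDrmFormatModifierFn::name().as_ptr(),
    ];

    let queue_priorities = [1.0f32];
    let queue_infos = [vk::DeviceQueueCreateInfo::builder()
        .queue_family_index(0) // pick a real graphics queue family in practice
        .queue_priorities(&queue_priorities)
        .build()];

    let device_info = vk::DeviceCreateInfo::builder()
        .queue_create_infos(&queue_infos)
        .enabled_extension_names(&extensions);
    let device = unsafe { instance.create_device(physical, &device_info, None) }
        .expect("vkCreateDevice failed");

    (instance, device)
}
```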
Essentially wgpu lacks what is needed to be a first class renderer that I'd consider suitable for inclusion into Smithay.
However this does not mean wgpu is unusable.
It should be entirely possible to use wgpu as a guest renderer on top of the GLES 2 renderer (you'll need to ensure GLES 3.0 is available, because wgpu requires GLES 3.0) or the future Vulkan renderer. This would involve using the wgpu-hal APIs to create an Instance and Device while telling wgpu not to take ownership of the instance and device (which Smithay would own).
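To make the shape of that concrete, here is a rough outline of what the Vulkan variant could look like. The `wrap_*`/`pick_*` helpers are hypothetical placeholders for the unsafe constructors in wgpu_hal::vulkan (Instance::from_raw, Adapter::device_from_raw), and both those constructors and wgpu's `*_from_hal` entry points have shifted between releases, so treat this as a sketch rather than working code:

```rust
// Sketch only: Smithay keeps ownership of the VkInstance/VkDevice and wgpu
// merely rides on top of them as a guest.
use wgpu_hal::api::Vulkan; // wgpu_hal version must match your wgpu release

// Hypothetical helpers that wrap compositor-owned Vulkan handles into wgpu-hal
// types without handing over a drop guard, so wgpu never destroys them.
fn wrap_vk_instance(_raw: &ash::Instance) -> wgpu_hal::vulkan::Instance {
    todo!("wgpu_hal::vulkan::Instance::from_raw(..) with no drop guard")
}
fn pick_hal_adapter(_inst: &wgpu_hal::vulkan::Instance) -> wgpu_hal::ExposedAdapter<Vulkan> {
    todo!("enumerate hal adapters and match the compositor's VkPhysicalDevice")
}
fn wrap_vk_device(
    _adapter: &wgpu_hal::ExposedAdapter<Vulkan>,
    _raw: &ash::Device,
) -> wgpu_hal::OpenDevice<Vulkan> {
    todo!("wgpu_hal::vulkan::Adapter::device_from_raw(..)")
}

fn wgpu_as_guest(
    vk_instance: &ash::Instance,
    vk_device: &ash::Device,
) -> (wgpu::Device, wgpu::Queue) {
    unsafe {
        let hal_instance = wrap_vk_instance(vk_instance);
        let hal_adapter = pick_hal_adapter(&hal_instance);
        let hal_device = wrap_vk_device(&hal_adapter, vk_device);

        // wgpu reuses the compositor's instance/adapter/device instead of
        // creating its own.
        let instance = wgpu::Instance::from_hal::<Vulkan>(hal_instance);
        let adapter = instance.create_adapter_from_hal(hal_adapter);
        adapter
            .create_device_from_hal(hal_device, &wgpu::DeviceDescriptor::default(), None)
            .expect("wrapping the compositor's VkDevice failed")
    }
}
```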
> I've been experimenting on my own with WGPU and DRM and have gotten EGL rendering working, but I have yet to implement Vulkan.
I do have some pointers on that. In fact, I have made a (hacky: it works on amdgpu with a 6700 XT, but things are hardcoded, so it may just spit out errors on your system) example of how you'd render to a buffer that wgpu imports via its Vulkan hal types and then scan it out: https://gist.github.com/i509VCB/649d3b62f23a458ef18d3f2f4731eca7
I do have a plan to adjust the VulkanAllocator to allow importing dmabufs, so that if you truly wanted to do that, you'd have Smithay deal with all the mess of importing the dmabuf and give you a VkImage.
This does require using wgpu-hal, but it is a use case I do want to support. I am also on the Smithay and wgpu Matrix channels if you have questions about that (see the badges in the README).
Additionally, there is only one real learning resource for wgpu, and it's pretty low quality. Another problem is that the WebGPU standard is not stable, and WGSL requires a custom pre-processor to implement fake import directives and many other features present in GLSL and HLSL.
There is already discussion about wgpu; see the discussions section.
Previous discussions on that subject for reference:
- https://github.com/Smithay/smithay/discussions/431
- https://github.com/Smithay/smithay/issues/134
- https://github.com/Smithay/smithay/issues/129