Support for OpenXR
OpenXR is pretty new, but there are already basic bindings for it in Rust: https://docs.rs/openxr. Eventually, supporting this would enable applications that use `gfx-hal` directly, or libraries that use `gfx-hal` like `wgpu`, to transparently support most VR and AR headsets on most architectures.
I'm proposing that support gets added to gfx-hal. I can contribute to this, but I don't know where to start.
OpenXR also adds support for grabbing events from peripherals specific to XR environments. Maybe that would have to be supported in winit?
I'm beginning to experiment with OpenXR integration on my own and am curious to see where this goes. I'm happy to help as well, but I'm by far not an expert with gfx or graphics in general, and I don't have a solid enough grasp of all the concepts to propose a clean abstraction between the two.
I think there would need to be some clear definition of what support would mean. It could range from providing structures and functions that bridge the gap and leaving the rest to the user, all the way to embedding OpenXR in gfx-hal so that things like swapchain creation and management are pure gfx-hal and it doesn't appear to the user as an external API.
Curious to get input from the gfx maintainers on this, and to see whether there have been any other efforts into this.
Yes, I think making it mostly transparent for the user is the right way to go. OpenXR will take some Vulkan or OpenGL instance data and present you with swapchains to render onto. Hiding all the configuration will be complex, as there are a lot of VR-specific things, like positioning in space and whatnot, that probably should not be hidden.
Maybe it'd be a good idea to port WebXR over to native as a wrapper over openxr, just as WebGPU has been ported to native.
Servo has developed a webxr crate with an openxr backend, if that helps at all: https://github.com/servo/webxr/
The OpenXR backend in `webxr` hard-codes D3D11 as the graphics backend, so that would need to be generalised before it could be used here.
In the meantime, I've been poking around at the Vulkan backend using the `openxr` Vulkan example as a guide, and it seems like the integration will need to be quite closely linked. OpenXR runtimes using the `XR_KHR_vulkan_enable` extension may specify Vulkan instance and/or device extensions that need to be enabled, and the upcoming `XR_KHR_vulkan_enable2` extension provides its own wrappers around `vkCreateInstance` and `vkCreateDevice` (presumably to move the extension configuration effort away from the application onto the runtime).
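For the legacy `XR_KHR_vulkan_enable` path, the mechanics look roughly like the sketch below. `merge_instance_extensions` is an illustrative helper, not an existing API; the runtime reports its required instance extensions via `xrGetVulkanInstanceExtensionsKHR` as a single space-delimited string that the application has to fold into its own `vkCreateInstance` parameters.

```rust
// Illustrative helper only: merge the extensions the OpenXR runtime requires
// (a space-delimited string from xrGetVulkanInstanceExtensionsKHR) into the
// list the application itself wants to enable for vkCreateInstance.
fn merge_instance_extensions(app_requested: &[&str], runtime_required: &str) -> Vec<String> {
    let mut extensions: Vec<String> = app_requested.iter().map(|s| s.to_string()).collect();
    for ext in runtime_required.split_whitespace() {
        // Skip extensions the application already asked for.
        if !extensions.iter().any(|e| e == ext) {
            extensions.push(ext.to_string());
        }
    }
    extensions
}
```

This is exactly the kind of plumbing that either the backend itself (option 1 below) or a wrapper crate (option 2) would have to hide.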
Thus I think a good starting question is how to structure this integration. Some possibilities:
- An extension to the existing backend system
  - e.g. have backends implement `Instance::create_with_xr(.., Option<hal::xr::System>)` and similar methods, making `Instance::create` a wrapper around `Instance::create_with_xr(.., None)` (see the sketch after this list).
  - This would enable the easiest code reuse, as backends would be supporting XR inline.
  - Users would need to know which specific methods to use and when to use them; it might be easier for users to get into invalid states.
- A wrapper around the backend system
  - e.g. have a `gfx-backend-openxr` crate that maybe depends on `gfx-backend-vulkan` for common types.
  - Types here would be something like `Instance<OpenXr<Vulkan>>` and `PhysicalDevice<OpenXr<Vulkan>>`.
  - This would end up duplicating most of the initialization and runtime logic (though some of it could be refactored into helper methods on `gfx-backend-vulkan` etc.) and thus might be more work to maintain?
- A parallel trait `XrInstance<B: Backend>: Instance<B>`
  - This might mesh more easily with WebXR.
  - Helps to keep XR-specific methods out of the way when not being used by applications.
  - Probably requires other parallel traits as well (`XrPhysicalDevice`?)
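To make option (1) concrete, here is a rough, hypothetical sketch of what that trait surface could look like. None of these names exist in gfx-hal today; the stand-in types are only there to keep the snippet self-contained, and the real trait's exact signatures and `unsafe` qualifiers are glossed over.

```rust
/// Stand-in for gfx-hal's backend trait.
pub trait Backend {}

/// Stand-in for gfx-hal's instance-creation error.
#[derive(Debug)]
pub struct UnsupportedBackend;

/// Hypothetical handle for the XR runtime/system the backend should target,
/// e.g. wrapping an `openxr::Instance` plus a system id.
pub struct XrSystem;

pub trait Instance<B: Backend>: Sized {
    /// XR-aware constructor: when `xr` is `Some`, the backend enables whatever
    /// instance/device extensions the OpenXR runtime reports as required.
    fn create_with_xr(
        name: &str,
        version: u32,
        xr: Option<&XrSystem>,
    ) -> Result<Self, UnsupportedBackend>;

    /// The existing entry point becomes a thin wrapper around the XR-aware one.
    fn create(name: &str, version: u32) -> Result<Self, UnsupportedBackend> {
        Self::create_with_xr(name, version, None)
    }
}
```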
It seems to me that `Instance::create` could just get another argument, like in your (1) suggestion.
Is there any support for OpenXR on Metal and D3D12?
There's a D3D12 extension that is supported on the HoloLens 2 AFAIK. I am not aware of Metal support.
OpenXR has extensions for Vulkan, OpenGL, OpenGL ES, D3D11, and D3D12. It might be possible to use https://github.com/KhronosGroup/MoltenVK as a pathway to XR support on macOS.
@str4d err... there is nothing for us in MoltenVK. It's just a library. If there is no way to work with OpenXR on Metal, then it's not possible with MoltenVK either.
@kvark Ah, okay. I had interpreted it as providing a Vulkan runtime, and thus maybe being usable with `gfx-backend-vulkan`, but I've not used it (I'm not a macOS dev, or a graphics or VR dev for that matter - just a motivated Rust dev with some VR headsets who wants to be able to create simple VR apps in Rust 😄)
Hi, friendly neighborhood Rust beginner here. Any movement in this area since October? This is where my optimistic quest for doing VR in Rust has led me - I'm assuming it's impossible to do VR with gfx-rs without this feature? Or is there some hacky workaround I could do that you know of? Maybe like rendering to a framebuffer and then giving that to OpenXR myself?
I have an early-stage attempt at this locally, which is currently blocked on me learning more about how graphics pipelines work. I'm planning this week to clean up the commits a bit and then open a draft PR so others can give feedback on the direction.
Maybe a more loosely coupled approach should be used. OpenXR integration could be achieved using raw handles interop as discussed in #3698. @blaind
(As a note, there is also some continued discussion related to looser coupling at https://github.com/blaind/xrbevy/issues/1)
This is one take at what's needed for an XR rendering example between an app, the openxrs crate, and the gfx crates. Note the raw-handle usage in GFX, which is not included in any of the PRs above. This is not a suggestion for an implementation, but rather depicts the dependencies and required flow between the components.
The flow is modeled after the openxrs crate; the OpenXR spec for the vulkan2 initialization flow can be found at https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#XR_KHR_vulkan_enable2
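As a reference point, here is a minimal sketch of the first steps of that flow using the openxr crate. It assumes the crate's `linked` loader feature, and struct fields/method names may differ slightly between crate versions; the vulkan2 calls that hand Vulkan instance/device creation over to the runtime are only indicated in comments, since that is exactly where the gfx side would have to participate.

```rust
use openxr as xr;

fn init_xr() -> (xr::Instance, xr::SystemId) {
    // Loader: statically linked here ("linked" feature); Entry::load() would be
    // used to discover a runtime dynamically instead.
    let entry = xr::Entry::linked();

    // Ask for XR_KHR_vulkan_enable2 so the runtime can create the Vulkan
    // instance/device on our behalf.
    let mut extensions = xr::ExtensionSet::default();
    extensions.khr_vulkan_enable2 = true;

    // xrCreateInstance
    let instance = entry
        .create_instance(
            &xr::ApplicationInfo {
                application_name: "gfx-xr-example",
                application_version: 0,
                engine_name: "gfx-xr-example",
                engine_version: 0,
            },
            &extensions,
            &[], // no API layers
        )
        .expect("xrCreateInstance failed");

    // xrGetSystem: pick the head-mounted-display form factor.
    let system = instance
        .system(xr::FormFactor::HEAD_MOUNTED_DISPLAY)
        .expect("no HMD available");

    // From here, XR_KHR_vulkan_enable2 has the *runtime* drive
    // xrCreateVulkanInstanceKHR / xrCreateVulkanDeviceKHR, which is the point
    // where gfx::Instance / gfx::Device creation (or raw-handle import) has to
    // happen in the diagram above.
    (instance, system)
}
```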
Some initial thoughts about the requirements for each component in the above sequence diagram.
| Component | Needed by App | Needed by WGPU | Needed by GFX | Needed by abstraction | Notes |
|---|---|---|---|---|---|
| xr::Entry | :heavy_check_mark: | :o: | :o: | :white_circle: | Can maybe go to the abstraction instead of the app? Custom headset loaders, etc. |
| xr::Instance | :heavy_check_mark: | :o: | :o: | :heavy_check_mark: | Most probably needed by the app for controller configuration and XR state for the control loop |
| gfx::Instance | :o: | :heavy_check_mark: | :heavy_check_mark: | :white_check_mark: | Needed for initialization; must stay alive until the end of the app lifetime, but no direct need (method calls)? |
| gfx::Device | :white_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :white_check_mark: | Fences are needed in the render loop |
| xr::Session | :heavy_check_mark: | :o: | :o: | :heavy_check_mark: | App needs it for the render loop, controller state, etc. |
| xr::Swapchain | :grey_question: | :white_check_mark: | :o: | :white_check_mark: | Can the XR swapchain be abstracted to somewhere? |
| gfx::Image | :grey_question: | :white_check_mark: | (source) | :white_check_mark: | xrCreateSwapchain returns images via xrEnumerateSwapchainImages |
- :heavy_check_mark: = yes
- :white_check_mark: = maybe
- :grey_question: = maybe not
- :o: = not needed
- :white_circle: = optional
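To make the xr::Swapchain / gfx::Image rows more concrete: the sketch below (following the openxrs Vulkan example; field and method names may vary across crate versions, and `ash` is only pulled in for the `VkFormat` value) creates the runtime-owned swapchain and enumerates its raw `VkImage` handles. Wrapping those handles as gfx-hal images is exactly the raw-handle interop discussed in #3698 and is not shown here.

```rust
use openxr as xr;

// `session` would come out of the vulkan_enable2 initialization flow above.
fn create_xr_swapchain(
    session: &xr::Session<xr::Vulkan>,
    width: u32,
    height: u32,
) -> xr::Swapchain<xr::Vulkan> {
    // xrCreateSwapchain: the runtime owns the images backing this swapchain.
    let swapchain = session
        .create_swapchain(&xr::SwapchainCreateInfo {
            create_flags: xr::SwapchainCreateFlags::EMPTY,
            usage_flags: xr::SwapchainUsageFlags::COLOR_ATTACHMENT,
            // VkFormat is passed as its raw value.
            format: ash::vk::Format::R8G8B8A8_SRGB.as_raw() as _,
            sample_count: 1,
            width,
            height,
            face_count: 1,
            array_size: 2, // one array layer per eye
            mip_count: 1,
        })
        .expect("xrCreateSwapchain failed");

    // xrEnumerateSwapchainImages: yields the raw VkImage handles that a gfx
    // backend would need to import as gfx::Image (the "(source)" cell above).
    let _raw_images = swapchain
        .enumerate_images()
        .expect("xrEnumerateSwapchainImages failed");

    swapchain
}
```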