BabylonNative
Add support for AR occlusion
ARKit, ARCore, and OpenXR all support some kind of occlusion. See https://arstechnica.com/gadgets/2020/06/google-gives-android-depth-sensing-and-object-occlusion-with-arcore-1-18/ for some recent news about ARCore occlusion, and https://developer.apple.com/videos/play/wwdc2020/10612 (~7:30 mark) for some recent news about ARKit occlusion.
We should try to support this in some way. As far as I can tell, WebXR does not have any kind of occlusion support today, so we'll probably have to make something up that will hopefully be compatible with a future WebXR occlusion module. There is a small bit of info on this in the WebXR AR module spec:
The XR Compositor MAY make additional color or pixel adjustments to optimize the experience. The timing of composition MUST NOT depend on the blend technique or source of the real-world environment, but MUST NOT perform occlusion based on pixel depth relative to real-world geometry; only rendered content MUST be composed on top of the real-world background.
NOTE: Future modules may enable automatic or manual pixel occlusion with the real-world environment.
See https://www.w3.org/TR/webxr-ar-module-1/#xr-compositor-behaviors
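For context, per-pixel occlusion boils down to comparing the depth of each rendered fragment against the sensed depth of the real world at the same pixel, and hiding the fragment when the real surface is closer. In practice this would run in a shader, but the core test can be sketched on the CPU. This is just an illustrative sketch, not Babylon API; all names here are made up:

```typescript
// Per-pixel occlusion sketch: hide virtual pixels that lie behind the
// sensed real-world surface. All types and names are hypothetical.
interface DepthBuffers {
  virtualDepth: Float32Array;     // depth (meters) of rendered content; Infinity = no content
  environmentDepth: Float32Array; // sensed real-world depth (meters) from ARCore/ARKit/WebXR
}

// Returns a visibility mask: true where the virtual pixel should be drawn.
function occlusionMask({ virtualDepth, environmentDepth }: DepthBuffers): boolean[] {
  return Array.from(virtualDepth, (vz, i) =>
    vz !== Infinity && vz <= environmentDepth[i]
  );
}

// Example: a virtual object 2m away; the real surface is 3m away at the
// first pixel (object visible) and 1.5m away at the second (object hidden).
const mask = occlusionMask({
  virtualDepth: new Float32Array([2, 2]),
  environmentDepth: new Float32Array([3, 1.5]),
});
```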
Android ARCore has a Depth API (here is the doc), and iOS ARKit also has one. For the web, we could use deep learning, or maybe take the same approach ARCore does.
For iOS, besides people occlusion, the reconstructed environment mesh can also be used for occlusion, at least on LiDAR-capable devices: https://developer.apple.com/documentation/arkit/content_anchors/visualizing_and_interacting_with_a_reconstructed_scene
WebXR has a depth-sensing API proposal that already works on Android devices:
https://github.com/immersive-web/depth-sensing
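Under that proposal, depth is delivered either as a CPU-readable buffer or as a GPU texture. The session setup below mirrors the explainer's example; the `sampleDepthMeters` helper is our own sketch of how the CPU path's raw uint16 samples would be read (the real API wraps this in `XRCPUDepthInformation.getDepthInMeters`), so treat the details as assumptions:

```typescript
// Sketch of consuming the proposed WebXR depth-sensing API (CPU path).
// The browser-only setup is shown as comments, since it cannot run here:
//
// const session = await navigator.xr.requestSession("immersive-ar", {
//   requiredFeatures: ["depth-sensing"],
//   depthSensing: {
//     usagePreference: ["cpu-optimized"],
//     dataFormatPreference: ["luminance-alpha"],
//   },
// });
// Per frame: const depthInfo = frame.getDepthInformation(view);

// The CPU format delivers raw uint16 samples plus a rawValueToMeters scale.
// Hypothetical helper: look up depth at normalized coordinates (x, y in [0, 1]).
function sampleDepthMeters(
  data: Uint16Array,
  width: number,
  height: number,
  rawValueToMeters: number,
  x: number,
  y: number
): number {
  const col = Math.min(width - 1, Math.floor(x * width));
  const row = Math.min(height - 1, Math.floor(y * height));
  return data[row * width + col] * rawValueToMeters;
}

// Example: a 2x2 depth map where a raw value of 1000 at scale 0.001 is ~1 meter.
const meters = sampleDepthMeters(
  new Uint16Array([1000, 2000, 3000, 4000]), 2, 2, 0.001, 0, 0
);
```

A renderer would then compare these sampled depths against the virtual scene's depth buffer, exactly the kind of per-pixel occlusion the spec note above says future modules may enable.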
Is depth occlusion supported in Babylon.js now?
Not yet. This issue is still open. If anyone wants to try to implement it, let us know. :-)