
ARKit TrueDepth front facing camera depth map

sam598 opened this issue 4 years ago • 21 comments

ARKit provides access to the depth map from the TrueDepth camera. ARFoundation provides access to depth maps from the rear facing camera (if available) but not the front facing camera.

Currently the only way to access the TrueDepth camera's depth map in Unity is to modify the Objective-C source of the deprecated ARKit Unity plugin. That plugin has also been deleted by Unity and is no longer accessible (see https://github.com/Unity-Technologies/arfoundation-samples/issues/190).

sam598 avatar Oct 06 '20 17:10 sam598

> Currently the only way to access the TrueDepth camera's depth map in Unity is to modify the Objective-C source of the deprecated ARKit Unity plugin. That plugin has also been deleted by Unity and is no longer accessible (see #190).

You can get ARKit's ARFrame pointer with ARFoundation. If you are willing to write Objective-C code to get at the ARFrame's capturedDepthData, then you should have no problem accessing it.

tdmowrer avatar Oct 06 '20 23:10 tdmowrer

@tdmowrer I posted an issue earlier this year trying to do something very similar: https://github.com/Unity-Technologies/arfoundation-samples/issues/517

The final question, which was never resolved, was whether there is any available documentation or code to convert a CVPixelBufferRef to an XRCpuImage, or any suggestions on how to efficiently handle a CVPixelBuffer on the C# side of Unity.

sam598 avatar Oct 07 '20 00:10 sam598

You can get the CVPixelBuffer from the ARFrame. While XRCpuImage is an abstraction and, on iOS, talks to a CVPixelBuffer, you cannot create an XRCpuImage from a raw pointer to a CVPixelBuffer. Why do you need to?

tdmowrer avatar Oct 07 '20 01:10 tdmowrer

As I mentioned in the previous issue, it would be great to use a function like XRCpuImage.Transformation.Convert to keep the extension consistent with how AR Foundation works.

sam598 avatar Oct 07 '20 03:10 sam598

The bulk of the usefulness of the image converter is combining the two-plane YCbCr video stream into an RGB image, which isn't relevant to depth data. What transformations do you want to apply?

tdmowrer avatar Oct 07 '20 22:10 tdmowrer

I'm looking to convert a CVPixelBuffer to a Unity Texture.

I could use a similar technique to the old ARKit for Unity plugin by converting it to a Metal texture and updating it through an external Unity texture. But it is not clear if this is the best method or if AR Foundation is using a more efficient technique.

And again, it would be nice to keep the extension consistent with how AR Foundation works.

sam598 avatar Oct 07 '20 23:10 sam598

> I'm looking to convert a CVPixelBuffer to a Unity Texture.

That makes sense, but that is not what the XRCpuImage does.

> I could use a similar technique to the old ARKit for Unity plugin by converting it to a Metal texture and updating it through an external Unity texture. But it is not clear if this is the best method or if AR Foundation is using a more efficient technique.

That is exactly what we do in ARFoundation.

> And again, it would be nice to keep the extension consistent with how AR Foundation works.

I'm not sure what ARFoundation API you are expecting to be able to use. If your goal is to get a Texture2D from a pointer, Texture2D.CreateExternalTexture provides that functionality. It's not clear to me what functionality we are missing.

tdmowrer avatar Oct 08 '20 00:10 tdmowrer

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale[bot] avatar Oct 15 '20 03:10 stale[bot]

Thanks for confirming @tdmowrer. If I get around to making the extension I will post it here.

However, with depth cameras becoming more and more prevalent on XR devices, AR Foundation should probably have a proper unified way of accessing depth maps, especially since they are not always going to be "environment depth".

sam598 avatar Oct 15 '20 05:10 sam598

I would love this feature too - being able to get the front depth map as we currently can with the rear camera would be very useful. I've tried working on some kind of plugin as @tdmowrer hinted, but I got lost converting the ARFrame's depth data into something that can be used within Unity. @sam598 did you manage to sort this out?

momorprods avatar Nov 19 '20 13:11 momorprods

@momorprods it ain't pretty, but it works.

TrueDepthMap.mm

#import <ARKit/ARKit.h>

typedef struct UnityXRNativeSessionPtr
{
    int version;
    void* session;
} UnityXRNativeSessionPtr;

extern "C" struct UnityDepthTextureHandles
{
    void* textureDepth;
    double depthTimestamp;
    int width;
    int height;
};

static double s_DepthTimestamp = 0.0;
static id <MTLTexture> s_CapturedDepthImageTexture = NULL;
static int s_width = 0;
static int s_height = 0;

static id <MTLDevice> _device = NULL;

static CVMetalTextureCacheRef _textureCache;

extern "C"
{
    UnityDepthTextureHandles UnityGetDepthMap(UnityXRNativeSessionPtr* nativeSession)
    {
        if(_device == NULL)
        {
            _device = MTLCreateSystemDefaultDevice();
            CVMetalTextureCacheCreate(NULL, NULL, _device, NULL, &_textureCache);
        }

        ARSession* session = (__bridge ARSession*)nativeSession->session;
        ARFrame* frame = session.currentFrame;

        if (frame.capturedDepthData != NULL)
        {
            double newTimeStamp = frame.capturedDepthDataTimestamp;
            
            if(newTimeStamp > s_DepthTimestamp)
            {
                id<MTLTexture> textureDepth = nil;
                
                //NSLog(@"depth time is %.2f", s_DepthTimestamp);

                CVPixelBufferRef pixelBuffer = frame.capturedDepthData.depthDataMap;

                size_t depthWidth = CVPixelBufferGetWidth(pixelBuffer);
                size_t depthHeight = CVPixelBufferGetHeight(pixelBuffer);

                if(depthWidth != 0 && depthHeight != 0){
                    
                    MTLPixelFormat pixelFormat = MTLPixelFormatR32Float; // MTLPixelFormatR16Float is an option for half-precision depth

                    CVMetalTextureRef texture = NULL;
                    CVReturn status = CVMetalTextureCacheCreateTextureFromImage(NULL, _textureCache, pixelBuffer, NULL, pixelFormat, depthWidth, depthHeight, 0, &texture);
                    if(status == kCVReturnSuccess)
                    {
                        textureDepth = CVMetalTextureGetTexture(texture);
                        CFRelease(texture);
                    }
                    
                    if (textureDepth != nil) {
                        s_CapturedDepthImageTexture = textureDepth;
                        s_DepthTimestamp = newTimeStamp;
                        s_width = (int)depthWidth;
                        s_height = (int)depthHeight;
                    }
                }
            }
        }

        UnityDepthTextureHandles handles;

        handles.textureDepth = (__bridge_retained void*)s_CapturedDepthImageTexture;

        handles.depthTimestamp = s_DepthTimestamp;

        handles.width = s_width;
        handles.height = s_height;

        return handles;
    }

    void ReleaseDepthTextureHandles(UnityDepthTextureHandles handles)
    {
        if (handles.textureDepth != NULL)
        {
            CFRelease(handles.textureDepth);
        }
    }

    void UnityUnloadMetalCache()
    {
        if (_textureCache != NULL) {
            CFRelease(_textureCache);
            _textureCache = NULL;
        }
        _device = NULL;
    }
}

TrueDepthMap.cs

using System;
using System.Runtime.InteropServices;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.UI;

public class TrueDepthMap : MonoBehaviour
{
    public ARSession arSession;
    public ARCameraManager cameraManager;
    public RawImage depthDisplay;

    TrueDepthMapInterface trueDepthMapInterface;

    private double previousDepthTimestamp;

    private Texture2D _depthTexture;

    void OnEnable()
    {
        if (trueDepthMapInterface == null)
        {
            trueDepthMapInterface = new TrueDepthMapInterface();
        }
        if (cameraManager != null)
        {
            cameraManager.frameReceived += OnCameraFrameReceived;
        }
    }

    void OnDisable()
    {
        if (cameraManager != null)
        {
            cameraManager.frameReceived -= OnCameraFrameReceived;
        }
    }

    private void OnDestroy()
    {
        if (trueDepthMapInterface != null)
        {
            trueDepthMapInterface.UnloadMetalCache();
        }
    }

    void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
    {
        if (arSession.subsystem == null)
            return;

        DepthTextureHandles handles = trueDepthMapInterface.GetDepthMap(arSession.subsystem.nativePtr);

        if (handles.IsNull())
            return;

        if (handles.Width == 0 || handles.Height == 0)
            return;

        if (handles.DepthTimestamp > previousDepthTimestamp)
        {
            previousDepthTimestamp = handles.DepthTimestamp;

            if (_depthTexture != null)
            {
                if (handles.Width != _depthTexture.width || handles.Height != _depthTexture.height)
                {
                    Destroy(_depthTexture);
                    _depthTexture = null;
                }
            }

            if (_depthTexture == null)
            {
                _depthTexture = Texture2D.CreateExternalTexture(handles.Width, handles.Height, TextureFormat.RFloat, false, false, handles.TextureDepth);

                _depthTexture.filterMode = FilterMode.Point;
                _depthTexture.wrapMode = TextureWrapMode.Repeat;

            }
            _depthTexture.UpdateExternalTexture(handles.TextureDepth);

            if(depthDisplay != null)
                depthDisplay.texture = _depthTexture;
        }
    }
}

public class TrueDepthMapInterface
{
    [DllImport("__Internal")]
    public static extern DepthTextureHandles.DepthTextureHandlesStruct UnityGetDepthMap(IntPtr ptr);
    [DllImport("__Internal")]
    public static extern void UnityUnloadMetalCache();
    [DllImport("__Internal")]
    public static extern void ReleaseDepthTextureHandles(DepthTextureHandles.DepthTextureHandlesStruct handles);

    public DepthTextureHandles GetDepthMap(IntPtr nativePtr)
    {
#if !UNITY_EDITOR && UNITY_IOS
        return new DepthTextureHandles(UnityGetDepthMap(nativePtr));
#else
	return new DepthTextureHandles(new DepthTextureHandles.DepthTextureHandlesStruct { textureDepth = IntPtr.Zero });
#endif
    }

    public void UnloadMetalCache()
    {
        UnityUnloadMetalCache();
    }
}

public class DepthTextureHandles
{
    [StructLayout(LayoutKind.Sequential)] // layout must match the native UnityDepthTextureHandles struct
    public struct DepthTextureHandlesStruct
    {
        // Native (Metal) texture handle and metadata for the depth buffer
        public IntPtr textureDepth;
        public double depthTimestamp;
        public int width;
        public int height;
    }

    private DepthTextureHandlesStruct m_DepthTextureHandlesStruct;
    public IntPtr TextureDepth
    {
        get { return m_DepthTextureHandlesStruct.textureDepth; }
    }

    public double DepthTimestamp
    {
        get { return m_DepthTextureHandlesStruct.depthTimestamp; }
    }

    public int Width
    {
        get { return m_DepthTextureHandlesStruct.width; }
    }

    public int Height
    {
        get { return m_DepthTextureHandlesStruct.height; }
    }

    public DepthTextureHandles(DepthTextureHandlesStruct arTextureHandlesStruct)
    {
        m_DepthTextureHandlesStruct = arTextureHandlesStruct;
    }

#if !UNITY_EDITOR && UNITY_IOS
    ~DepthTextureHandles()
    {
        TrueDepthMapInterface.ReleaseDepthTextureHandles(m_DepthTextureHandlesStruct);
    }
#endif
    public bool IsNull()
    {
        return (m_DepthTextureHandlesStruct.textureDepth == IntPtr.Zero);
    }


    // Disable the default and copy constructors because we are not currently tracking references of the Objective-C handles in this case.
    private DepthTextureHandles()
    {
        Debug.Assert(false, "should not call the default constructor for DepthTextureHandles");
        m_DepthTextureHandlesStruct = new DepthTextureHandlesStruct { textureDepth = IntPtr.Zero };
    }

    private DepthTextureHandles(DepthTextureHandles arTextureHandles)
    {
        Debug.Assert(false, "should not call the copy constructor for DepthTextureHandles");
        m_DepthTextureHandlesStruct = new DepthTextureHandlesStruct { textureDepth = IntPtr.Zero };
    }

}

The original UnityARKitPlugin has some async methods that I took out to try to make sure the depth frame is converted as soon as Unity receives the color frame. Because this workaround is a strange ping-pong loop, I'm sure the AR Foundation team could make a version that is more efficient and safer.

Keep in mind that current iOS devices run the front-facing color camera at about 60 Hz, while the TrueDepth camera runs at about 15 Hz, so only 1 out of every 4 frames will have new depth data.

sam598 avatar Nov 24 '20 03:11 sam598

OMG you made my day @sam598 👍 🥇

momorprods avatar Nov 24 '20 07:11 momorprods

Thank you so much @sam598 it works as expected!

Unity people, please do integrate this natively in Unity.

cecarlsen avatar Dec 16 '20 09:12 cecarlsen

@sam598 thank you for the great work!

@tdmowrer Would love to see this functionality integrated into unity as well!

sek19 avatar Mar 02 '21 18:03 sek19

Thanks for the script!

Unity, please expose depth information from the face camera as well! It's a shame that ARKit limits certain functionality on a per-camera basis, but this would be a great feature to have in ARFoundation without resorting to native code, especially if it's reasonably easy to add.

lehtiniemi avatar Sep 14 '21 11:09 lehtiniemi

Unity, please please please integrate the depth map for the ARKit/iPhone front camera. It works for Android: I have an app that uses the Android depth map on the front camera, which is supported by AR Foundation. It seems logical to keep the platforms in sync, so it should be integrated for ARKit too. I don't want to have to wait until all my users' iPhones have LiDAR to implement the app. Many thx, KnewK

knewk avatar Oct 12 '21 22:10 knewk

> Unity, please please please integrate the depth map on the ARKit/iphone forward camera. It works for Android. I have an app that uses the Android depth map on forward camera, which is supported by AR Foundation. Seem logical to keep the platforms in sync and that it should be integrated for ARKit too. Don't want to have to wait until all my users iphones have lidar to implement the app. Many thx, KnewK

@knewk May I ask for a sample of how you did that for the front facing camera on android? We are trying to accomplish the same

patrick508 avatar Nov 16 '21 14:11 patrick508

> Thank you so much @sam598 it works as expected!
>
> Unity people, please do integrate this natively in Unity.

Can you outline how you got this to work? A sample scene, perhaps. I can't seem to get any depth image to display on a RawImage. I set up the AR Session, AR Camera Manager, and Raw Image components, but no luck.

knewk avatar Apr 26 '22 03:04 knewk

> Unity, please please please integrate the depth map on the ARKit/iphone forward camera. It works for Android. I have an app that uses the Android depth map on forward camera, which is supported by AR Foundation. Seem logical to keep the platforms in sync and that it should be integrated for ARKit too. Don't want to have to wait until all my users iphones have lidar to implement the app. Many thx, KnewK

> @knewk May I ask for a sample of how you did that for the front facing camera on android? We are trying to accomplish the same

ARCore Depth Lab on GitHub has been converted to AR Foundation using ARCore Extensions for AR Foundation. Instructions are in the repo: https://github.com/googlesamples/arcore-depth-lab

knewk avatar Apr 26 '22 11:04 knewk

@sam598 @momorprods @cecarlsen @sek19

Can you please help me set this up in order to receive depth data from the TrueDepth sensor?

  1. I created a new Unity project
  2. Imported the ARKit XR plugin
  3. Pasted TrueDepthMap.mm into Assets > Plugins > iOS
  4. Created TrueDepthMap.cs and added it to the scene
  5. But when I make a build in Xcode I see only a white screen, although it says the camera is in use.

Can you please provide a demo project? I need help getting real-time depth in Unity.

alithediscover avatar Mar 16 '23 11:03 alithediscover

@knewk @alithediscover did you ever manage to get TrueDepth from the user-facing camera (using this method or another)?

I am also getting no data.

Marwan-imverse avatar Sep 28 '23 15:09 Marwan-imverse