MediaPipeUnityPlugin

Input from ARCamera to Holistic Scene

Open KiranJodhani opened this issue 2 years ago • 8 comments

Feature Description

@homuler I have been using this plugin for the last 6 months and have found it very useful. It works great so far. Now I have a specific requirement where I need to use both Holistic and AR Foundation: I plan to do face tracking with ARKit and track the rest of the body with Holistic. To do so, I created a sample scene with the basic setup and wrote a script to listen for camera frames:

```csharp
using System;
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
// Fields such as arCameraManager, arTexture, arTextureToUpdate and OutputImage are declared elsewhere in the class.

public void Start()
{
  arCameraManager.frameReceived += OnARCameraFrameReceived;
}

private void OnARCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
{
  arTextureToUpdate = GetCurrentColorTexture();
  OutputImage.texture = arTextureToUpdate;
  ApplyNewTexture(arTextureToUpdate);
}

unsafe Texture2D GetCurrentColorTexture()
{
  if (!arCameraManager.TryAcquireLatestCpuImage(out XRCpuImage image)) return null;

  // Convert the latest CPU image to an RGBA32 texture, mirrored on the X axis.
  var conversionParams = new XRCpuImage.ConversionParams
  {
    inputRect = new RectInt(0, 0, image.width, image.height),
    outputDimensions = new Vector2Int(image.width, image.height),
    outputFormat = TextureFormat.RGBA32,
    transformation = XRCpuImage.Transformation.MirrorX,
  };

  int size = image.GetConvertedDataSize(conversionParams);
  var buffer = new NativeArray<byte>(size, Allocator.Temp);

  image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);
  image.Dispose();

  // Reuse a single Texture2D instead of allocating a new one every frame.
  if (arTexture == null)
  {
    arTexture = new Texture2D(conversionParams.outputDimensions.x, conversionParams.outputDimensions.y, conversionParams.outputFormat, false);
  }

  arTexture.LoadRawTextureData(buffer);
  arTexture.Apply();
  buffer.Dispose();
  return arTexture;
}
```

Here I can see that the frame is being received and applied to a sample RawImage in the UI. I thought of two approaches:

1 - Apply the generated texture directly to StaticImageSource and select Image in Bootstrap. While implementing this I found that changing an image in Texture[] _availableSources does not actually take effect, because I think it is initialized only once. Looking into it further, I found that the new texture also has to be applied to _outputTexture, which was not happening when I changed the texture (either from the inspector or from code), because InitializeOutputTexture is called only once. When I made that call from OnARCameraFrameReceived it worked, but the RawImage in Screen.cs was not updating, so I took a reference to Screen.cs and applied the new texture there as well. It then worked: I checked with a couple of images, changed them from code, and found it working in the editor. Then I made an Android build and tested it; after running for around 5 seconds the app crashed. I suspect it's because of memory consumption. I was wondering whether using the image generated from the ARCamera as input for MediaPipe is a safe approach at all.

2 - In your comment above you mentioned that if I have a Texture2D at hand I can use it like this:

```csharp
// ReadFromImageSource(imageSource, textureFrame);

// Texture2D texture2d;
// textureFrame.ReadTextureFromOnCPU(texture2d);
```

Here I am a little confused about the type of ImageSource. What should be selected here as input, WebCamera or Image? If I select Image then point 1 above applies, and selecting WebCamera will conflict with the ARCamera.

Current Behaviour/State

No response

Additional Context

No response

KiranJodhani avatar Nov 03 '23 11:11 KiranJodhani

Could you please describe the problem more simply? For example, while you mention some things that are working, it's unclear how they relate to the overall issue. As a result, I don't understand what you really want to know.

Now I will answer the questions based on what I have understood.

TL;DR

You should not use ImageSource classes. You should build an ImageFramePacket by yourself and send it to the CalculatorGraph (cf. https://github.com/homuler/MediaPipeUnityPlugin/issues/343#issuecomment-1096698719 )

Details

First, please note that the source files under Assets are for the sample app. You may use them, but they are not intended to be used outside of this project. So you don't need to use the ImageSource classes (e.g. StaticImageSource) to run the CalculatorGraph in your project.

To run the CalculatorGraph, you need to generate the input data (an ImageFramePacket) (cf. https://github.com/homuler/MediaPipeUnityPlugin/wiki/Getting-Started#send-imageframe ). Once you've done that, there's no need to use the ImageSource classes, since the input data is already available. Please read https://github.com/homuler/MediaPipeUnityPlugin/issues/343#issuecomment-1096698719 to see how to do that.
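For illustration, that flow could look roughly like the sketch below. This is only a sketch based on the Getting Started wiki, not code from the sample app: the helper class name is made up, the input stream name "input_video" is taken from the sample graph configs, the texture is assumed to be the RGBA32 Texture2D built from the ARCamera image above, and the exact ImageFrame/CalculatorGraph signatures may differ slightly between plugin versions (older versions return a Status that you may want to AssertOk()).

```csharp
using Mediapipe;
using UnityEngine;

// Hypothetical helper, not part of the plugin or the sample app.
public static class HolisticGraphInput
{
  // Sends an RGBA32 Texture2D to an already built and started CalculatorGraph
  // as an ImageFramePacket on the "input_video" stream.
  public static void Send(CalculatorGraph graph, Texture2D texture, long timestampMicrosec)
  {
    // widthStep = bytes per row for tightly packed RGBA32 pixel data.
    var imageFrame = new ImageFrame(
        ImageFormat.Types.Format.Srgba,
        texture.width, texture.height,
        texture.width * 4,
        texture.GetRawTextureData<byte>());

    graph.AddPacketToInputStream(
        "input_video",
        new ImageFramePacket(imageFrame, new Timestamp(timestampMicrosec)));
  }
}
```

Note that Unity textures and MediaPipe images do not necessarily agree on the image origin, so depending on the platform you may also have to flip the image vertically before sending it; the sketch above ignores orientation entirely.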

> Here I am a little confused about the type of ImageSource. What should be selected here as input, WebCamera or Image?

In the sample app, an ImageSource class is a source of input images. For example, StaticImageSource can be used as the Texture generator for a static image, and WebCamSource is a wrapper class around WebCamTexture. However, there is no ImageSource class that returns the ARCamera image, which means there is no proper ImageSource that can be used in your case.

homuler avatar Nov 03 '23 12:11 homuler

> Now if I pass this Texture2D to HolisticTrackingGraph, will this work?

The answer is the following:

> To run the CalculatorGraph, you need to generate the input data (an ImageFramePacket) (cf. https://github.com/homuler/MediaPipeUnityPlugin/wiki/Getting-Started#send-imageframe ). Once you've done that, there's no need to use the ImageSource classes, since the input data is already available. Please read https://github.com/homuler/MediaPipeUnityPlugin/issues/343#issuecomment-1096698719 to see how to do that.

> Like what you said here https://github.com/homuler/MediaPipeUnityPlugin/issues/343#issuecomment-962349426

I said that "please read https://github.com/homuler/MediaPipeUnityPlugin/issues/343#issuecomment-1096698719", not https://github.com/homuler/MediaPipeUnityPlugin/issues/343#issuecomment-962349426.

Please read the implementation of the OnCameraFrameReceived method.

homuler avatar Nov 03 '23 12:11 homuler

See https://github.com/homuler/MediaPipeUnityPlugin/issues/343#issuecomment-1773003981.

homuler avatar Nov 05 '23 11:11 homuler

> MediaPipeException: NOT_FOUND: ; Unable to attach observer to output stream "face_detections" because it doesn't exist.

Doesn't this error message explain the cause sufficiently? See also HolisticTrackingGraph: https://github.com/homuler/MediaPipeUnityPlugin/blob/2d2863ea740a6a5ad01854ea88ab5f48be2a36b6/Assets/MediaPipeUnity/Samples/Scenes/Holistic/HolisticTrackingGraph.cs#L220-L227
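As a general rule, an observer (or poller) can only be attached to a stream name that the graph config actually declares as an output stream, and the holistic graph declares streams such as face_landmarks but no face_detections. Below is a minimal sketch of that rule, adapted from the plugin's Hello World example rather than from HolisticTrackingGraph itself; the class name and the pass-through config are made up for illustration, and the exact return type of AddOutputStreamPoller differs between plugin versions.

```csharp
using Mediapipe;

// Hypothetical illustration, not the holistic sample: a stream can only be observed or
// polled if the CalculatorGraphConfig declares it as an output stream.
public static class OutputStreamExistenceDemo
{
  private const string ConfigText = @"
input_stream: ""in""
output_stream: ""out""
node {
  calculator: ""PassThroughCalculator""
  input_stream: ""in""
  output_stream: ""out""
}
";

  public static void Run()
  {
    var graph = new CalculatorGraph(ConfigText);

    // "out" is declared above, so attaching a poller to it succeeds.
    var poller = graph.AddOutputStreamPoller<string>("out");

    // "face_detections" is not declared in this config (nor in the holistic graph),
    // so observing or polling it fails with NOT_FOUND, which is the error quoted above.
    // graph.AddOutputStreamPoller<string>("face_detections");
  }
}
```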

If you've not finished the tutorial, I strongly recommend you finish it first.

homuler avatar Nov 08 '23 01:11 homuler