
Is it possible to expose the different camera feeds from the Kinect as an input in a blueprint?

Open torinmb opened this issue 4 years ago • 4 comments

I need to be able to record and save the video input from the Kinect while the pose tracking is running. Unfortunately, once Unreal is running the pose tracking, no other application can access the Kinect input, so I'm wondering if there's any way to expose the video in a blueprint.

Any help would be greatly appreciated! Thanks!

torinmb avatar Jun 30 '20 17:06 torinmb

^ Just wanted to bump this. Azure provides the command-line tool k4arecorder for recording, but again it doesn't work while Unreal is using the Kinect. Maybe I could run a CMD script from Unreal to trigger k4arecorder? Even though the script would be launched from Unreal, it would still try to open a second stream from the Kinect, which wouldn't work. Any advice on getting access to the different video feeds?
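For what it's worth, triggering k4arecorder from Unreal would look roughly like the sketch below (the executable path, output path, and arguments are placeholders, not something the plugin provides). As suspected above, it will still fail while the plugin has the device open, since the Azure Kinect only allows one process to stream from it at a time.

    // Sketch: spawn k4arecorder as an external process from Unreal.
    // Paths and arguments are placeholders; adjust to your SDK install and output location.
    #include "HAL/PlatformProcess.h"

    const FString RecorderPath = TEXT("C:/Program Files/Azure Kinect SDK v1.4.1/tools/k4arecorder.exe");
    const FString Args = TEXT("--device 0 -l 10 C:/Temp/capture.mkv");

    FProcHandle RecorderProc = FPlatformProcess::CreateProc(
        *RecorderPath, *Args,
        /*bLaunchDetached=*/ true, /*bLaunchHidden=*/ true, /*bLaunchReallyHidden=*/ false,
        /*OutProcessID=*/ nullptr, /*PriorityModifier=*/ 0,
        /*OptionalWorkingDirectory=*/ nullptr, /*PipeWriteChild=*/ nullptr);

    if (!RecorderProc.IsValid())
    {
        UE_LOG(LogTemp, Warning, TEXT("Failed to launch k4arecorder"));
    }
    // Expect k4arecorder itself to error out if another process already owns the device.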

torinmb avatar Aug 04 '20 18:08 torinmb

Just duplicating my comment from #11, but we're planning on adding video feed support for sure!

dylandevs avatar Nov 12 '20 17:11 dylandevs

hi, you can save the image buffer and write it into a render target. The code below shows how to write the depth image into a render target resource. Make sure to include the RHI module in the plugin's dependencies, and you need to load a render target (RT) reference and store it on the same thread.
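For the "load a RT reference" part, a minimal sketch of creating and caching the render target at runtime on the game thread might look like this (the DepthRT name and the 640x576 size are assumptions; size it to whatever depth mode you configured, and a 16-bit float format matches the raw copy below):

    // Sketch: create and cache the depth render target on the game thread before capture starts.
    // 640x576 matches the NFOV unbinned depth mode; use the dimensions of your configured mode.
    #include "Engine/TextureRenderTarget2D.h"

    UTextureRenderTarget2D* DepthRT = NewObject<UTextureRenderTarget2D>(GetTransientPackage());
    DepthRT->InitCustomFormat(640, 576, PF_R16F, /*bInForceLinearGamma=*/ true);
    DepthRT->UpdateResourceImmediate(/*bClearRenderTarget=*/ true);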

In AzureKinectDevice::CaptureBodyTrackingFrame(), add the following after the if (!NativeBodyTracker.enqueue_capture(sensorCapture, TimeOutInMilliSecsConverted)) block:

    if (sensorCapture)
    {
        // We have the capture. Render commands can't be issued from this thread, so hand the
        // buffer off to the game thread, which enqueues the actual copy on the render thread.

        // Optional debug: inspect the raw depth buffer.
        //uint8_t* bufferaddress = sensorCapture.get_depth_image().get_buffer();
        //size_t buffersize = sensorCapture.get_depth_image().get_size();
        //int bufferstride = sensorCapture.get_depth_image().get_stride_bytes();
        //UE_LOG(LogTemp, Warning, TEXT("Pointer: 0x%016llx, Stride: %d"), (uint64)bufferaddress, bufferstride);

        UpdateTextureFence.BeginFence();
        AsyncTask(ENamedThreads::GameThread, [=]()
        {
            ENQUEUE_RENDER_COMMAND(FUpdateTexture)(
                [this, sensorCapture, Resource2D = (FTextureRenderTarget2DResource*)DepthRT->Resource]
                (FRHICommandListImmediate& RHICommandList)
                {
                    // Lock the render target, copy the 16-bit depth buffer in, and unlock.
                    uint32 DestStride = 0;
                    void* RawData = RHICommandList.LockTexture2D(Resource2D->GetTextureRHI(), 0, RLM_WriteOnly, DestStride, false, false);
                    FFloat16* dst = (FFloat16*)RawData;
                    FMemory::Memcpy(dst, sensorCapture.get_depth_image().get_buffer(), sensorCapture.get_depth_image().get_size());
                    RHICommandList.UnlockTexture2D(Resource2D->GetTextureRHI(), 0, false, false);
                });
            //DepthRT->UpdateResourceImmediate(false);
        });
    }
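One caveat with the single Memcpy above: it assumes the locked texture's row pitch (DestStride) equals the depth image's stride. If the RHI pads rows, copying row by row is safer. A minimal sketch, reusing RawData and DestStride from the LockTexture2D call:

    // Sketch: copy the 16-bit depth image row by row so any RHI row padding is respected.
    const k4a::image DepthImage = sensorCapture.get_depth_image();
    const int32 Width = DepthImage.get_width_pixels();
    const int32 Height = DepthImage.get_height_pixels();
    const int32 SrcStride = DepthImage.get_stride_bytes();
    const uint8* Src = DepthImage.get_buffer();

    uint8* Dst = (uint8*)RawData; // RawData and DestStride come from RHICommandList.LockTexture2D above
    for (int32 Row = 0; Row < Height; ++Row)
    {
        FMemory::Memcpy(Dst + Row * DestStride, Src + Row * SrcStride, Width * sizeof(uint16));
    }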

Tokusei avatar Feb 14 '21 04:02 Tokusei

Amazing, thanks! Will try this.

Zyperworld avatar Feb 16 '21 12:02 Zyperworld