
Android - gesture recognizer and WebRTC on same camera

Open linlumio opened this issue 2 years ago • 3 comments

Hi everyone, I am implementing an application that makes video calls over WebRTC and recognises gestures to launch 'shortcuts'. The problem is that I can't manage the device camera correctly for both streams: when I start gesture recognition, the WebRTC video stops, and vice versa, so the two cannot work simultaneously. One idea I had was to manage the streams similarly to this guide: https://developer.android.com/training/camera2/multiple-camera-streams-simultaneously#multiple-targets, adding a second target to the capture request. But I can't figure out how to pass a surface instead of calling cameraProvider.bindToLifecycle. How can I pass the video stream to the analyser? Is there a better way?
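For reference, a minimal sketch of the multi-target idea using CameraX rather than raw Camera2: `bindToLifecycle` accepts several use cases at once, so one camera can feed both a preview surface and an `ImageAnalysis` analyzer. Note this does not by itself feed WebRTC; frames would still have to be pushed into a custom `VideoCapturer`. `previewView`, `lifecycleOwner`, `context`, and `runGestureRecognition` are placeholders, not names from this thread.

```kotlin
import androidx.camera.core.CameraSelector
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.Preview
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.core.content.ContextCompat

fun bindCamera(cameraProvider: ProcessCameraProvider) {
    val preview = Preview.Builder().build().also {
        it.setSurfaceProvider(previewView.surfaceProvider)
    }
    val analysis = ImageAnalysis.Builder()
        // Drop stale frames so the analyzer never falls behind the camera.
        .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
        .build()
        .also { useCase ->
            useCase.setAnalyzer(ContextCompat.getMainExecutor(context)) { imageProxy ->
                runGestureRecognition(imageProxy)  // e.g. convert to MPImage, call recognizeAsync
                imageProxy.close()                 // must close, or analysis stalls
            }
        }
    cameraProvider.unbindAll()
    // One bind call, several targets on the same camera.
    cameraProvider.bindToLifecycle(
        lifecycleOwner, CameraSelector.DEFAULT_FRONT_CAMERA, preview, analysis
    )
}
```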

linlumio avatar Sep 22 '23 08:09 linlumio

@jenperson @schmidt-sebastian Hey, any thoughts on this by chance?

PaulTR avatar Sep 29 '23 14:09 PaulTR

Would it be possible to only send the webcam stream to MP Tasks, and then use the MPImage that we return to visualize the stream? This might add a bit of latency, but it's at least worth prototyping.
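A minimal sketch of this suggestion, assuming MediaPipe Tasks' LIVE_STREAM mode: the result listener receives the input `MPImage` back alongside each result, so that same image can drive the on-screen preview instead of a second camera stream. `renderToPreview` and `context` are hypothetical placeholders.

```kotlin
import com.google.mediapipe.framework.image.MPImage
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.gesturerecognizer.GestureRecognizer
import com.google.mediapipe.tasks.vision.gesturerecognizer.GestureRecognizerResult

val options = GestureRecognizer.GestureRecognizerOptions.builder()
    .setBaseOptions(
        BaseOptions.builder().setModelAssetPath("gesture_recognizer.task").build()
    )
    .setRunningMode(RunningMode.LIVE_STREAM)
    // In live-stream mode the listener gets both the result and the frame
    // that produced it; reuse that frame to draw the local preview.
    .setResultListener { result: GestureRecognizerResult, input: MPImage ->
        renderToPreview(input, result)  // hypothetical rendering helper
    }
    .build()

val recognizer = GestureRecognizer.createFromOptions(context, options)
```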

schmidt-sebastian avatar Sep 29 '23 19:09 schmidt-sebastian

I made it work, but the other way around: I added a listener to WebRTC's SurfaceView (SurfaceViewRenderer) and call recognizeAsync on every frame. It gives me quite good response times in onResult, and the CPU is not boiling.
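One way to realize this approach, assuming the org.webrtc API: a `VideoSink` added to the local `VideoTrack` receives every frame, while the same track keeps feeding the `SurfaceViewRenderer` for the call UI. `videoFrameToBitmap` is a hypothetical I420-to-Bitmap conversion helper, and `recognizer` is assumed to be a `GestureRecognizer` created in LIVE_STREAM mode as in the earlier comments.

```kotlin
import com.google.mediapipe.framework.image.BitmapImageBuilder
import org.webrtc.VideoFrame
import org.webrtc.VideoSink
import org.webrtc.VideoTrack

fun attachGestureSink(localVideoTrack: VideoTrack) {
    val gestureSink = VideoSink { frame: VideoFrame ->
        // The frame is only valid inside this callback, so convert it
        // synchronously before handing the result to MediaPipe.
        val timestampMs = frame.timestampNs / 1_000_000
        val bitmap = videoFrameToBitmap(frame)  // hypothetical conversion helper
        // recognizeAsync expects a monotonically increasing timestamp in ms;
        // results arrive on the listener configured in the recognizer options.
        recognizer.recognizeAsync(BitmapImageBuilder(bitmap).build(), timestampMs)
    }
    // The renderer stays attached to the track, so the call preview is unaffected.
    localVideoTrack.addSink(gestureSink)
}
```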

linlumio avatar Nov 07 '23 14:11 linlumio