Android - gesture recognizer and webRTC on same camera
Hi everyone,
I am implementing an application that makes video calls over WebRTC and recognises gestures to launch 'shortcuts'.
The problem is that I can't manage the device camera correctly for both streams:
when I start gesture recognition, the WebRTC video stops, and vice versa, so the two cannot run simultaneously.
One idea I had was to manage the streams in a similar way to this:
https://developer.android.com/training/camera2/multiple-camera-streams-simultaneously#multiple-targets
That is, adding a second target Surface to the capture request. However, since I'm using CameraX, I can't figure out how to pass a Surface directly instead of calling cameraProvider.bindToLifecycle.
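For reference, with CameraX the usual way to share one camera between two consumers is not to pass a raw Surface yourself but to bind multiple use cases (a Preview and an ImageAnalysis) in a single bindToLifecycle call; CameraX then attaches both targets to the same capture session. Below is a minimal, hedged sketch of that idea. Names like previewView, analysisExecutor, and feedFrameToGestureRecognizer are placeholders for illustration, not APIs from the original post:

```kotlin
// Sketch only: bind Preview (feeding the WebRTC track) and ImageAnalysis
// (feeding the gesture recognizer) in ONE bindToLifecycle call, so both
// receive frames from the same camera session.
val preview = Preview.Builder().build().also {
    // previewView is a placeholder; in a WebRTC app you might instead
    // provide a Surface backed by a SurfaceTextureHelper for the video track.
    it.setSurfaceProvider(previewView.surfaceProvider)
}

val analysis = ImageAnalysis.Builder()
    // Drop stale frames so gesture recognition never backs up the pipeline.
    .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
    .build().also { useCase ->
        useCase.setAnalyzer(analysisExecutor) { imageProxy ->
            // Placeholder: convert the frame and pass it to the
            // MediaPipe gesture recognizer here.
            feedFrameToGestureRecognizer(imageProxy)
            imageProxy.close() // must close, or the analyzer stalls
        }
    }

cameraProvider.unbindAll()
// One call, two use cases: this is what lets both streams run simultaneously,
// instead of two competing bindToLifecycle calls that tear each other down.
cameraProvider.bindToLifecycle(
    lifecycleOwner,
    CameraSelector.DEFAULT_FRONT_CAMERA,
    preview,
    analysis
)
```

The key point is that calling bindToLifecycle twice (once per stream) makes the second bind steal the camera from the first, which matches the symptom described above.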
How can I pass the video stream to the analyser?
Is there a better way?