
How to call USB camera with official demo?

HOLDfoot opened this issue 2 years ago • 6 comments

I have searched the issue list and found some related issues, but I think this is a feature request more than a bug.

System information: Android 11

  • MediaPipe Solution (you are using): hand and posetracking
  • Programming language: Android Java
  • Are you willing to contribute it (Yes/No): No

Describe the feature and the current behavior/state: The official demo doesn't support USB cameras. The code comes from the repo below, and I need to use a USB camera. https://github.com/jiuqiant/mediapipe_multi_hands_tracking_aar_example

Will this change the current API? How? No

Who will benefit from this feature? Android developers who use a USB camera but don't want to learn MediaPipe internals.

Please specify the use cases for this feature: MediaPipe's official Android demo (https://github.com/jiuqiant/mediapipe_multi_hands_tracking_aar_example) only supports the front and back cameras, but we use a USB camera, because the demo uses CameraX rather than Camera2 or the old Camera API. We need two demos that support a USB camera: one hand-tracking demo and one pose-tracking demo.

Any other info: a picture of our Android device is attached.

HOLDfoot avatar Jul 12 '22 03:07 HOLDfoot

(attached image: 微信图片_20220712113619)

HOLDfoot avatar Jul 12 '22 03:07 HOLDfoot

CameraXPreviewHelper uses the CameraX API in a fairly involved way. Because of this, it is difficult for me to switch it from CameraX to the old Camera API or the Camera2 API. Can anybody provide a Camera2 or android.hardware.Camera demo?

HOLDfoot avatar Jul 20 '22 03:07 HOLDfoot
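For reference, here is a minimal sketch of how the Camera2 API can locate a USB camera; the class name UsbCameraFinder and the helper findExternalCameraId are illustrative, not part of MediaPipe:

```java
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;

public final class UsbCameraFinder {
    // Illustrative helper: returns the ID of the first camera that reports
    // LENS_FACING_EXTERNAL (how USB/UVC cameras usually appear in Camera2),
    // or null if none is present.
    public static String findExternalCameraId(Context context) throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        for (String id : manager.getCameraIdList()) {
            CameraCharacteristics chars = manager.getCameraCharacteristics(id);
            Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == CameraCharacteristics.LENS_FACING_EXTERNAL) {
                return id;
            }
        }
        return null; // no external camera exposed by this device's HAL
    }
}
```

Whether a given USB camera shows up here at all depends on the device's camera HAL; many Android builds do not expose UVC cameras through Camera2.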

I have read some related issues, such as:

  • Run Mediapipe on background service (Android) #3439
  • How to use mediapipe with camera1.0 or 2.0 #490
  • How can I use webcam as input? #10
  • How to call USB camera ? #1955
  • How to call USB camera using the CameraXPreviewHelper? #1649

I can't find an ideal solution.

When I searched on Stack Overflow, some people suggested recompiling MediaPipe and editing the source code to get a custom AAR for Android. I am trying this way, but it is still difficult for me.

HOLDfoot avatar Jul 20 '22 05:07 HOLDfoot

If there were a C++ .so library file and documentation, it would help me a lot to integrate on Android, but I can't find a pure .so library with a Java caller. Maybe I am lazy, but I think many people need a demo for a USB camera, and a demo that can run in a background service without an Activity.

HOLDfoot avatar Jul 20 '22 06:07 HOLDfoot

I read this issue: How to use mediapipe with camera1.0 or 2.0 #490. But I can't agree with the comment by @eknight7:

We don't have any plans presently to provide Camera2 API support, CameraX is an easier to use API

As an Android developer, I find camera1 easier than CameraX. Both camera1 and camera2 are easy to extend to a USB camera. Can you add "provide Camera2 API support" to your schedule?

HOLDfoot avatar Jul 20 '22 06:07 HOLDfoot
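To illustrate the point about camera1, a minimal sketch with the deprecated android.hardware.Camera API; the class Camera1Probe is illustrative, and whether the USB camera appears as an extra index depends on the device's HAL:

```java
import android.hardware.Camera;
import android.util.Log;

public final class Camera1Probe {
    private static final String TAG = "Camera1Probe";

    // Illustrative only: log every camera index the legacy API exposes and
    // open the last one, which on devices whose HAL exposes the USB camera
    // is often the index after front (0) and back (1).
    @SuppressWarnings("deprecation")
    public static Camera openLastCamera() {
        int count = Camera.getNumberOfCameras();
        for (int i = 0; i < count; i++) {
            Camera.CameraInfo info = new Camera.CameraInfo();
            Camera.getCameraInfo(i, info);
            Log.i(TAG, "camera " + i + " facing=" + info.facing);
        }
        return count > 0 ? Camera.open(count - 1) : null;
    }
}
```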

After my analysis, I found a way that works: override the Java class inside the AAR. Take com.google.mediapipe.components.CameraXPreviewHelper, recreate the same package and file in your own project, add a MyCameraFilter class, and update CameraXPreviewHelper's code. The two files are below; with them I can now access the USB camera through CameraX.

com.example.mediapipeposetracking.MyCameraFilter:

```java
package com.example.mediapipeposetracking;

import android.annotation.SuppressLint;
import android.util.Log;

import androidx.annotation.NonNull;
import androidx.camera.core.Camera;
import androidx.camera.core.CameraFilter;

import java.util.Iterator;
import java.util.LinkedHashSet;

@SuppressLint({"UnsafeExperimentalUsageError", "UnsafeOptInUsageError"})
public class MyCameraFilter implements CameraFilter {

private static final String TAG = "zmr";

@SuppressLint("RestrictedApi")
@NonNull
@Override
public LinkedHashSet<Camera> filter(@NonNull LinkedHashSet<Camera> cameras) {
    Log.i(TAG, "cameras size: " + cameras.size());
    Iterator<Camera> cameraIterator = cameras.iterator();
    Camera camera = null;
    while (cameraIterator.hasNext()) {
        camera = cameraIterator.next();
        String getImplementationType = camera.getCameraInfo().getImplementationType();
        Log.i(TAG, "getImplementationType: " + getImplementationType);
    }
    LinkedHashSet<Camera> linkedHashSet = new LinkedHashSet<>();
    linkedHashSet.add(camera); // keep the last camera
    return linkedHashSet;
}

}
```

com.google.mediapipe.components.CameraXPreviewHelper:

```java
//
// Source code recreated from a .class file by IntelliJ IDEA
// (powered by FernFlower decompiler)
//

package com.google.mediapipe.components;

import android.annotation.SuppressLint;
import android.app.Activity;
import android.content.Context;
import android.graphics.SurfaceTexture;
import android.graphics.SurfaceTexture.OnFrameAvailableListener;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.opengl.GLES20;
import android.os.Handler;
import android.os.HandlerThread;
import android.os.SystemClock;
import android.util.Log;
import android.util.Size;
import android.util.SizeF;
import android.view.Surface;
import androidx.camera.core.Camera;
import androidx.camera.core.CameraSelector;
import androidx.camera.core.ImageCapture;
import androidx.camera.core.ImageCapture.Builder;
import androidx.camera.core.ImageCapture.OnImageSavedCallback;
import androidx.camera.core.ImageCapture.OutputFileOptions;
import androidx.camera.core.Preview;
import androidx.camera.core.UseCase;
import androidx.camera.lifecycle.ProcessCameraProvider;
import androidx.core.content.ContextCompat;
import androidx.lifecycle.LifecycleOwner;

import com.example.mediapipeposetracking.MyCameraFilter;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.mediapipe.glutil.EglManager;
import java.io.File;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import java.util.concurrent.Executor;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.RejectedExecutionException;
import javax.annotation.Nonnull;
import javax.annotation.Nullable;
import javax.microedition.khronos.egl.EGLSurface;

public class CameraXPreviewHelper extends CameraHelper {
private static final String TAG = "CameraXPreviewHelper";
private static final Size TARGET_SIZE = new Size(1280, 720);
private static final int CLOCK_OFFSET_CALIBRATION_ATTEMPTS = 3;
private final CameraXPreviewHelper.SingleThreadHandlerExecutor renderExecutor = new CameraXPreviewHelper.SingleThreadHandlerExecutor("RenderThread", 0);
private ProcessCameraProvider cameraProvider;
private Preview preview;
private ImageCapture imageCapture;
private Builder imageCaptureBuilder;
private ExecutorService imageCaptureExecutorService;
private Camera camera;
private Size frameSize;
private int frameRotation;
private boolean isImageCaptureEnabled = false;
@Nullable
private CameraCharacteristics cameraCharacteristics = null;
private float focalLengthPixels = Float.MIN_VALUE;
private int cameraTimestampSource = 0;

public CameraXPreviewHelper() {
}

public void startCamera(Activity activity, CameraFacing cameraFacing, SurfaceTexture unusedSurfaceTexture) {
    this.startCamera((Context) activity, (LifecycleOwner) activity, cameraFacing, TARGET_SIZE);
}

public void startCamera(Activity activity, CameraFacing cameraFacing, SurfaceTexture unusedSurfaceTexture, @Nullable Size targetSize) {
    this.startCamera((Context) activity, (LifecycleOwner) activity, cameraFacing, targetSize);
}

public void startCamera(Activity activity, @Nonnull Builder imageCaptureBuilder, CameraFacing cameraFacing, @Nullable Size targetSize) {
    this.imageCaptureBuilder = imageCaptureBuilder;
    this.startCamera((Context) activity, (LifecycleOwner) activity, cameraFacing, targetSize);
}

@SuppressLint("UnsafeOptInUsageError")
public void startCamera(Context context, LifecycleOwner lifecycleOwner, CameraFacing cameraFacing, @Nullable Size targetSize) {
    Log.e(TAG, "startCamera my class");
    Executor mainThreadExecutor = ContextCompat.getMainExecutor(context);
    ListenableFuture<ProcessCameraProvider> cameraProviderFuture = ProcessCameraProvider.getInstance(context);
    targetSize = targetSize == null ? TARGET_SIZE : targetSize;
    Size rotatedSize = new Size(targetSize.getHeight(), targetSize.getWidth());
    cameraProviderFuture.addListener(() -> {
        try {
            this.cameraProvider = (ProcessCameraProvider)cameraProviderFuture.get();
        } catch (Exception var7) {
            if (var7 instanceof InterruptedException) {
                Thread.currentThread().interrupt();
            }

            Log.e(TAG, "Unable to get ProcessCameraProvider: ", var7);
            return;
        }

        this.preview = (new androidx.camera.core.Preview.Builder()).setTargetResolution(rotatedSize).build();
        CameraSelector cameraSelector = cameraFacing == CameraFacing.FRONT ? CameraSelector.DEFAULT_FRONT_CAMERA : CameraSelector.DEFAULT_BACK_CAMERA;
        //cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA;
        // TODO: overrides the selector above; this line sometimes affects whether the build picks up this overriding class
        cameraSelector = new CameraSelector.Builder().addCameraFilter(new MyCameraFilter()).build(); // select the last camera
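        // Route preview frames into our own SurfaceTexture; once the first frame
        // arrives, onInitialFrameReceived hands the texture to onCameraStartedListener.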
        this.preview.setSurfaceProvider(this.renderExecutor, (request) -> {
            Size resolution = request.getResolution();
            Log.d(TAG, String.format("Received surface request for resolution %dx%d", resolution.getWidth(), resolution.getHeight()));
            SurfaceTexture previewFrameTexture = createSurfaceTexture();
            previewFrameTexture.setDefaultBufferSize(resolution.getWidth(), resolution.getHeight());
            previewFrameTexture.setOnFrameAvailableListener((frameTexture) -> {
                if (frameTexture == previewFrameTexture) {
                    this.onInitialFrameReceived(context, frameTexture);
                }
            }, this.renderExecutor.getHandler());
            Surface surface = new Surface(previewFrameTexture);
            Log.d(TAG, "Providing surface");
            request.provideSurface(surface, this.renderExecutor, (result) -> {
                Log.d(TAG, "Surface request result: " + result);
                previewFrameTexture.release();
                surface.release();
            });
        });
        this.cameraProvider.unbindAll();
        if (this.imageCaptureBuilder != null) {
            this.imageCapture = this.imageCaptureBuilder.build();
            this.camera = this.cameraProvider.bindToLifecycle(lifecycleOwner, cameraSelector, new UseCase[]{this.preview, this.imageCapture});
            this.imageCaptureExecutorService = Executors.newSingleThreadExecutor();
            this.isImageCaptureEnabled = true;
        } else {
            this.camera = this.cameraProvider.bindToLifecycle(lifecycleOwner, cameraSelector, new UseCase[]{this.preview});
        }

    }, mainThreadExecutor);
}

public void takePicture(File outputFile, OnImageSavedCallback onImageSavedCallback) {
    if (this.isImageCaptureEnabled) {
        OutputFileOptions outputFileOptions = (new androidx.camera.core.ImageCapture.OutputFileOptions.Builder(outputFile)).build();
        this.imageCapture.takePicture(outputFileOptions, this.imageCaptureExecutorService, onImageSavedCallback);
    }
}

public boolean isCameraRotated() {
    return this.frameRotation % 180 == 90;
}

public Size computeDisplaySizeFromViewSize(Size viewSize) {
    return this.frameSize;
}

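// Picks, among the sensor's supported SurfaceTexture output sizes, the one
// whose aspect ratio is closest to the requested target size.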
@Nullable
private Size getOptimalViewSize(Size targetSize) {
    if (this.cameraCharacteristics != null) {
        StreamConfigurationMap map = (StreamConfigurationMap)this.cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        Size[] outputSizes = map.getOutputSizes(SurfaceTexture.class);
        int selectedWidth = -1;
        int selectedHeight = -1;
        float selectedAspectRatioDifference = 1000.0F;
        float targetAspectRatio = (float)targetSize.getWidth() / (float)targetSize.getHeight();
        Size[] var8 = outputSizes;
        int var9 = outputSizes.length;

        for(int var10 = 0; var10 < var9; ++var10) {
            Size size = var8[var10];
            float aspectRatio = (float)size.getWidth() / (float)size.getHeight();
            float aspectRatioDifference = Math.abs(aspectRatio - targetAspectRatio);
            if (aspectRatioDifference <= selectedAspectRatioDifference && (selectedWidth == -1 && selectedHeight == -1 || size.getWidth() <= selectedWidth && size.getWidth() >= this.frameSize.getWidth() && size.getHeight() <= selectedHeight && size.getHeight() >= this.frameSize.getHeight())) {
                selectedWidth = size.getWidth();
                selectedHeight = size.getHeight();
                selectedAspectRatioDifference = aspectRatioDifference;
            }
        }

        if (selectedWidth != -1 && selectedHeight != -1) {
            return new Size(selectedWidth, selectedHeight);
        }
    }

    return null;
}

public long getTimeOffsetToMonoClockNanos() {
    return this.cameraTimestampSource == 1 ? getOffsetFromRealtimeTimestampSource() : getOffsetFromUnknownTimestampSource();
}

private static long getOffsetFromUnknownTimestampSource() {
    return 0L;
}

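// Estimates the offset between System.nanoTime() and SystemClock.elapsedRealtimeNanos()
// by bracketing the realtime read between two monotonic reads and keeping the
// attempt with the smallest measurement gap.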
private static long getOffsetFromRealtimeTimestampSource() {
    long offset = Long.MAX_VALUE;
    long lowestGap = Long.MAX_VALUE;

    for(int i = 0; i < CLOCK_OFFSET_CALIBRATION_ATTEMPTS; ++i) {
        long startMonoTs = System.nanoTime();
        long realTs = SystemClock.elapsedRealtimeNanos();
        long endMonoTs = System.nanoTime();
        long gapMonoTs = endMonoTs - startMonoTs;
        if (gapMonoTs < lowestGap) {
            lowestGap = gapMonoTs;
            offset = (startMonoTs + endMonoTs) / 2L - realTs;
        }
    }

    return offset;
}

public float getFocalLengthPixels() {
    return this.focalLengthPixels;
}

public Size getFrameSize() {
    return this.frameSize;
}

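// Runs once on the first preview frame: records the attached frame size and
// sensor rotation, then notifies onCameraStartedListener on the main thread.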
@SuppressLint("RestrictedApi")
private void onInitialFrameReceived(Context context, SurfaceTexture previewFrameTexture) {
    previewFrameTexture.setOnFrameAvailableListener((OnFrameAvailableListener)null);
    previewFrameTexture.updateTexImage();
    previewFrameTexture.detachFromGLContext();
    Log.e(TAG, "preview.getAttachedSurfaceResolution(): " + preview.getAttachedSurfaceResolution() + " frameSize: " + frameSize);
    if (!this.preview.getAttachedSurfaceResolution().equals(this.frameSize)) {
        this.frameSize = this.preview.getAttachedSurfaceResolution();
        this.frameRotation = this.camera.getCameraInfo().getSensorRotationDegrees();
        if (this.frameSize.getWidth() == 0 || this.frameSize.getHeight() == 0) {
            Log.d(TAG, "Invalid frameSize.");
            return;
        }
    }

    Integer selectedLensFacing = this.cameraFacing == CameraFacing.FRONT ? 0 : 1; // TODO: always equals 1 here
    this.cameraCharacteristics = getCameraCharacteristics(context, selectedLensFacing);
    if (this.cameraCharacteristics != null) {
        this.cameraTimestampSource = (Integer)this.cameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_TIMESTAMP_SOURCE);
        //this.focalLengthPixels = this.calculateFocalLengthInPixels();
    }

    OnCameraStartedListener listener = this.onCameraStartedListener;
    if (listener != null) {
        ContextCompat.getMainExecutor(context).execute(() -> {
            listener.onCameraStarted(previewFrameTexture);
        });
    }

}

private float calculateFocalLengthInPixels() {
    float focalLengthMm = ((float[])this.cameraCharacteristics.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS))[0];
    float sensorWidthMm = ((SizeF)this.cameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE)).getWidth();
    return (float)this.frameSize.getWidth() * focalLengthMm / sensorWidthMm;
}

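// Creates a SurfaceTexture backed by a fresh GL texture in a temporary
// offscreen EGL context; it is later detached in onInitialFrameReceived.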
private static SurfaceTexture createSurfaceTexture() {
    EglManager eglManager = new EglManager((Object)null);
    EGLSurface tempEglSurface = eglManager.createOffscreenSurface(1, 1);
    eglManager.makeCurrent(tempEglSurface, tempEglSurface);
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    SurfaceTexture previewFrameTexture = new SurfaceTexture(textures[0]);
    return previewFrameTexture;
}

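// Looks up CameraCharacteristics by lens facing (0 = front, 1 = back). Note that
// a USB camera usually reports LENS_FACING_EXTERNAL, so this lookup never matches
// it and cameraCharacteristics stays null for external cameras.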
@Nullable
private static CameraCharacteristics getCameraCharacteristics(Context context, Integer lensFacing) {
    CameraManager cameraManager = (CameraManager)context.getSystemService(Context.CAMERA_SERVICE);

    try {
        List<String> cameraList = Arrays.asList(cameraManager.getCameraIdList());
        Iterator var4 = cameraList.iterator();

        while(var4.hasNext()) {
            String availableCameraId = (String)var4.next();
            CameraCharacteristics availableCameraCharacteristics = cameraManager.getCameraCharacteristics(availableCameraId);
            Integer availableLensFacing = (Integer)availableCameraCharacteristics.get(CameraCharacteristics.LENS_FACING);
            Log.e(TAG, "availableCameraId: " + availableCameraId + " availableLensFacing: " + availableLensFacing + " cameraList size: " + cameraList.size());
            if (availableLensFacing != null && availableLensFacing.equals(lensFacing)) {
                return availableCameraCharacteristics;
            }
        }
    } catch (CameraAccessException var8) {
        Log.e(TAG, "Accessing camera ID info got error: " + var8);
    }

    return null;
}

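// Minimal Executor that serializes work onto a dedicated HandlerThread.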
private static final class SingleThreadHandlerExecutor implements Executor {
    private final HandlerThread handlerThread;
    private final Handler handler;

    SingleThreadHandlerExecutor(String threadName, int priority) {
        this.handlerThread = new HandlerThread(threadName, priority);
        this.handlerThread.start();
        this.handler = new Handler(this.handlerThread.getLooper());
    }

    Handler getHandler() {
        return this.handler;
    }

    public void execute(Runnable command) {
        if (!this.handler.post(command)) {
            throw new RejectedExecutionException(this.handlerThread.getName() + " is shutting down.");
        }
    }

    boolean shutdown() {
        return this.handlerThread.quitSafely();
    }
}

}
```

HOLDfoot avatar Jul 21 '22 06:07 HOLDfoot

As I have solved it with the above solution, I am closing this issue now.

HOLDfoot avatar Aug 19 '22 02:08 HOLDfoot

Hello, may I ask which CameraX version your MyCameraFilter class targets? On my side, implementing CameraFilter requires public List<CameraInfo> filter(@Nonnull List<CameraInfo> cameraInfos).

xiaojiedeGit avatar Aug 31 '22 08:08 xiaojiedeGit
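For anyone else hitting this signature change: in stable CameraX releases, androidx.camera.core.CameraFilter filters a List<CameraInfo> instead of a LinkedHashSet<Camera>. A minimal, untested sketch of the same keep-the-last-camera filter against that newer signature:

```java
import androidx.annotation.NonNull;
import androidx.camera.core.CameraFilter;
import androidx.camera.core.CameraInfo;

import java.util.Collections;
import java.util.List;

public class MyCameraFilter implements CameraFilter {
    // Newer CameraFilter contract: return a subset of the given list.
    @NonNull
    @Override
    public List<CameraInfo> filter(@NonNull List<CameraInfo> cameraInfos) {
        if (cameraInfos.isEmpty()) {
            return cameraInfos;
        }
        // Keep only the last camera, mirroring the original LinkedHashSet version.
        return Collections.singletonList(cameraInfos.get(cameraInfos.size() - 1));
    }
}
```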

@HOLDfoot Your code is very cool and it works. However, I ran into some random crashes when the camera activity starts. Do you have an update to the code? Thanks.

fchen09 avatar Apr 14 '23 02:04 fchen09

@HOLDfoot I have the same problem as you. Could you tell me the project path of the MyCameraFilter class? I put the MyCameraFilter file into the path "examples/android/src/java/com/google/mediapipe/apps/handtracking", set its package to "package com.google.mediapipe.apps.handtrackinggpu", and changed the import to "import com.google.mediapipe.apps.handtrackinggpu.MyCameraFilter;", but the build failed with "not found the class MyCameraFilter".

Could you help me? Thanks a lot.

bugmany avatar Jul 18 '23 03:07 bugmany