
object_detection_3d: SIGSEGV when nativeStartRunningGraph is called

Open dilldilldill opened this issue 2 years ago • 6 comments

I am integrating mediapipe into my WebRTC-based Android application written in Java. To verify that the basic mediapipe setup works, I first got the simplest edge detection example running with my app, which it did.

I then built an AAR file for mediapipe/graphs/object_detection_3d:mobile_calculators and tried to use the graph binarypb file built from object_occlusion_tracking.pbtxt in order to get shoe detection working.

The procedure should be very similar to the edge detection example, but with additional side packets (obj_asset_name, box_asset_name, model_scale, etc.) and packets for the input_width and input_height input streams. I stayed as close as possible to the objectdetection3d example app for this (see code below).

My asset folder now contains:

  1. box.obj.uuu
  2. classic_colors.png
  3. mobile_gpu_binary_graph.binarypb
  4. model.obj.uuu
  5. texture.jpg

I set the input side packets, add packets to the input streams, and prepare the demo assets exactly as the example app does.

When I try to run the app I get a SIGSEGV error:

I/native: I20220721 11:37:31.700362 16409 graph.cc:478] Start running the graph, waiting for inputs.
I/native: I20220721 11:37:31.700467 16409 gl_context_egl.cc:84] Successfully initialized EGL. Major : 1 Minor: 5
I/native: I20220721 11:37:31.701685 16542 gl_context.cc:335] GL version: 3.2 (OpenGL ES 3.2 [email protected] (GIT@5a9022f91f, Ib11adbd47c, 1627309424) (Date:07/26/21))
I/native: I20220721 11:37:31.711827 16403 jni_util.cc:41] GetEnv: not attached
A/libc: Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0 in tid 16403 (Thread-29), pid 16273 (app.test)

I debugged my code to pin down which call is responsible for the error: it is line 186 inside the startRunningGraph method of the Graph class, where nativeStartRunningGraph is called.

The error doesn't give me any hint about what is going wrong, and I don't know what else to try at the moment.

See below my code that differs from the working edge detection example:

AndroidAssetUtil.initializeNativeAssetManager(appContext);

this.eglManager = new EglManager(null);
this.frameProcessor = new FrameProcessor(
        appContext,
        eglManager.getNativeContext(),
        "mobile_gpu_binary_graph.binarypb",
        "input_video", "output_video"
);

BitmapFactory.Options decodeOptions = new BitmapFactory.Options();
decodeOptions.inScaled = false;
decodeOptions.inDither = false;
decodeOptions.inPremultiplied = false;

try {
    InputStream inputStream = context.getAssets().open(OBJ_TEXTURE);
    objTexture = BitmapFactory.decodeStream(inputStream, null /*outPadding*/, decodeOptions);
    inputStream.close();
} catch (Exception e) {
    Log.e(TAG, "Error parsing object texture; error: " + e);
    throw new IllegalStateException(e);
}

try {
    InputStream inputStream = context.getAssets().open(BOX_TEXTURE);
    boxTexture = BitmapFactory.decodeStream(inputStream, null /*outPadding*/, decodeOptions);
    inputStream.close();
} catch (Exception e) {
    Log.e(TAG, "Error parsing box texture; error: " + e);
    throw new RuntimeException(e);
}

AndroidPacketCreator packetCreator = this.frameProcessor.getPacketCreator();
Map<String, Packet> inputSidePackets = new HashMap<>();
inputSidePackets.put("obj_asset_name", packetCreator.createString(OBJ_FILE));
inputSidePackets.put("box_asset_name", packetCreator.createString(BOX_FILE));
inputSidePackets.put("obj_texture", packetCreator
        .createRgbaImageFrame(this.objTexture));
inputSidePackets.put("box_texture", packetCreator
        .createRgbaImageFrame(this.boxTexture));
inputSidePackets.put("allowed_labels", packetCreator.createString("Footwear"));
inputSidePackets.put("max_num_objects", packetCreator.createInt32(4));
inputSidePackets.put("model_scale", packetCreator.createFloat32Array(
        parseFloatArrayFromString("0.25, 0.25, 0.12")));
inputSidePackets.put("model_transformation", packetCreator.createFloat32Array(
                parseFloatArrayFromString(
                        "1.0,  0.0, 0.0, 0.0,\n" +
                        "0.0,  0.0, 1.0, 0.0,\n" +
                        "0.0, -1.0, 0.0, 0.0,\n" +
                        "0.0,  0.0, 0.0, 1.0")
        )
);
this.frameProcessor.setInputSidePackets(inputSidePackets);

this.frameProcessor.setOnWillAddFrameListener((timestamp) -> {
      try {
          Packet widthPacket = frameProcessor.getPacketCreator().createInt32(textureWidth);
          Packet heightPacket = frameProcessor.getPacketCreator().createInt32(textureHeight);
      
          try {
              frameProcessor.getGraph().addPacketToInputStream("input_width", widthPacket, timestamp);
              frameProcessor.getGraph().addPacketToInputStream("input_height", heightPacket, timestamp);
      
          } catch (RuntimeException e) {
              Log.e(TAG, "MediaPipeException encountered adding " +
                      "packets to input_width and input_height input streams.", e);
          }
          widthPacket.release();
          heightPacket.release();
      
      } catch (IllegalStateException ise) {
          Log.e(TAG, "Exception while adding packets to width and height input streams.");
      }
});

dilldilldill avatar Jul 21 '22 09:07 dilldilldill

Hi @dilldilldill , Could you provide steps to reproduce this issue.

sureshdagooglecom avatar Jul 26 '22 06:07 sureshdagooglecom

I tried to explain all the steps to reproduce this issue in my original post. Below is the complete code that produces the error. For the assets, graph, etc. used, please see the original post.

public class MainActivity extends AppCompatActivity {
    private SurfaceTexture previewFrameTexture;
    private SurfaceView previewDisplayView;
    private CameraXPreviewHelper cameraHelper;

    private FrameProcessor frameProcessor;
    private EglManager eglManager;
    private ExternalTextureConverter textureConverter;

    private static final String OBJ_TEXTURE = "texture.jpg";
    private static final String OBJ_FILE = "model.obj.uuu";
    private static final String BOX_TEXTURE = "classic_colors.png";
    private static final String BOX_FILE = "box.obj.uuu";
    private static final String TAG = "MainActivity";

    private Bitmap objTexture = null;
    private Bitmap boxTexture = null;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        System.loadLibrary("opencv_java3");
        System.loadLibrary("mediapipe_jni");
        AndroidAssetUtil.initializeNativeAssetManager(this);

        previewDisplayView = new SurfaceView(this);
        previewDisplayView.getHolder().addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(@NonNull SurfaceHolder holder) {}

            @Override
            public void surfaceChanged(@NonNull SurfaceHolder holder, int format, int width, int height) {
                Size viewSize = new Size(width, height);
                Size displaySize = cameraHelper.computeDisplaySizeFromViewSize(viewSize);

                frameProcessor.getVideoSurfaceOutput().setSurface(holder.getSurface());
                frameProcessor.setOnWillAddFrameListener((timestamp) -> {
                    try {
                        int cameraTextureWidth = displaySize.getWidth();
                        int cameraTextureHeight = displaySize.getHeight();

                        Packet widthPacket = frameProcessor.getPacketCreator().createInt32(cameraTextureWidth);
                        Packet heightPacket = frameProcessor.getPacketCreator().createInt32(cameraTextureHeight);

                        try {
                            frameProcessor.getGraph().addPacketToInputStream("input_width", widthPacket, timestamp);
                            frameProcessor.getGraph().addPacketToInputStream("input_height", heightPacket, timestamp);

                        } catch (RuntimeException e) {
                            Log.e(TAG, "MediaPipeException encountered adding " +
                                    "packets to input_width and input_height input streams.", e);
                        }
                        widthPacket.release();
                        heightPacket.release();

                    } catch (IllegalStateException ise) {
                        Log.e(TAG, "Exception while adding packets " +
                                "to width and height input streams.");
                    }
                });

                textureConverter.setDestinationSize(displaySize.getWidth(), displaySize.getHeight());
            }

            @Override
            public void surfaceDestroyed(@NonNull SurfaceHolder holder) {

            }
        });
        previewDisplayView.setVisibility(View.GONE);
        ViewGroup viewGroup = findViewById(R.id.preview_display_layout);
        viewGroup.addView(previewDisplayView);

        this.eglManager = new EglManager(null);

        this.frameProcessor = new FrameProcessor(
                this,
                eglManager.getNativeContext(),
                "mobile_gpu_binary_graph.binarypb",
                "input_video", "output_video"
        );
        prepareDemoAssets();
        AndroidPacketCreator packetCreator = this.frameProcessor.getPacketCreator();
        Map<String, Packet> inputSidePackets = new HashMap<>();
        inputSidePackets.put("obj_asset_name", packetCreator.createString(OBJ_FILE));
        inputSidePackets.put("box_asset_name", packetCreator.createString(BOX_FILE));
        inputSidePackets.put("obj_texture", packetCreator.createRgbaImageFrame(this.objTexture));
        inputSidePackets.put("box_texture", packetCreator.createRgbaImageFrame(this.boxTexture));
        inputSidePackets.put("allowed_labels", packetCreator.createString("Footwear"));
        inputSidePackets.put("max_num_objects", packetCreator.createInt32(2));
        inputSidePackets.put("model_scale", packetCreator.createFloat32Array(parseFloatArrayFromString("0.25, 0.25, 0.12")));
        inputSidePackets.put("model_transformation", packetCreator.createFloat32Array(parseFloatArrayFromString(
                "1.0,  0.0, 0.0, 0.0,\n" +
                        "0.0,  0.0, 1.0, 0.0,\n" +
                        "0.0, -1.0, 0.0, 0.0,\n" +
                        "0.0,  0.0, 0.0, 1.0"))
        );
        this.frameProcessor.setInputSidePackets(inputSidePackets);

        this.textureConverter = new ExternalTextureConverter(this.eglManager.getContext());
        this.textureConverter.setConsumer(this.frameProcessor);

        PermissionHelper.checkAndRequestCameraPermissions(this);
    }

    @Override
    public void onRequestPermissionsResult(
            int requestCode, String[] permissions, int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        PermissionHelper.onRequestPermissionsResult(requestCode, permissions, grantResults);
    }

    @Override
    protected void onResume() {
        super.onResume();

        if (PermissionHelper.cameraPermissionsGranted(this)) {
            startCamera();
        }
    }

    private static float[] parseFloatArrayFromString(String string) {
        String[] elements = string.split(",", -1);
        float[] array = new float[elements.length];
        for (int i = 0; i < elements.length; ++i) {
            array[i] = Float.parseFloat(elements[i]);
        }
        return array;
    }

    private void prepareDemoAssets() {
        // We render from raw data with openGL, so disable decoding preprocessing
        BitmapFactory.Options decodeOptions = new BitmapFactory.Options();
        decodeOptions.inScaled = false;
        decodeOptions.inDither = false;
        decodeOptions.inPremultiplied = false;

        try {
            InputStream inputStream = getAssets().open(OBJ_TEXTURE);
            objTexture = BitmapFactory.decodeStream(inputStream, null /*outPadding*/, decodeOptions);
            inputStream.close();
        } catch (Exception e) {
            Log.e(TAG, "Error parsing object texture; error: " + e);
            throw new IllegalStateException(e);
        }

        try {
            InputStream inputStream = getAssets().open(BOX_TEXTURE);
            boxTexture = BitmapFactory.decodeStream(inputStream, null /*outPadding*/, decodeOptions);
            inputStream.close();
        } catch (Exception e) {
            Log.e(TAG, "Error parsing box texture; error: " + e);
            throw new RuntimeException(e);
        }
    }

    public void startCamera() {
        cameraHelper = new CameraXPreviewHelper();
        previewFrameTexture = textureConverter.getSurfaceTexture();
        cameraHelper.setOnCameraStartedListener(
                surfaceTexture -> {
                    previewDisplayView.setVisibility(View.VISIBLE);
                });

        CameraHelper.CameraFacing cameraFacing = CameraHelper.CameraFacing.BACK;
        cameraHelper.startCamera(this, cameraFacing, previewFrameTexture);
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (textureConverter != null) {
            textureConverter.close();
        }
    }
}

dilldilldill avatar Jul 26 '22 08:07 dilldilldill

Does the code look okay? Are there any obvious mistakes? Or have I made a mistake in building the AAR file from mediapipe/graphs/object_detection_3d:mobile_calculators or the binarypb file from object_occlusion_tracking.pbtxt? Any help would be appreciated.

dilldilldill avatar Aug 01 '22 12:08 dilldilldill

I am working on a similar solution and I am getting pretty much the same SIGSEGV. Any ideas?

PonyHugger avatar Aug 01 '22 12:08 PonyHugger

If there is no obvious mistake in my code, could someone maybe provide prebuilt AAR files for the shoe object_detection_3d example and the corresponding graph files?

dilldilldill avatar Aug 16 '22 08:08 dilldilldill

Thanks for investigating where the segfault occurs. I see from your native log output that Graph::StartRunningGraph reaches "Start running the graph, waiting for inputs.". This means that the CalculatorGraph has successfully initialized and that all of the calculators referenced in the CalculatorGraphConfig have successfully resolved. I imagine that one of the Calculator nodes is failing in its "Open" method and unfortunately segfaults before CalculatorGraph::StartRun can return an error message, most likely because it is missing a necessary resource or is running on an unsupported platform.

If we could see the native stack trace, we would at least know which Calculator::Open function is failing. Let me know if you can find the native stack trace in any way. If not, maybe we can work on surfacing that information through the MediaPipe Java bindings.
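One way to get at that native stack trace is to dump logcat to a file after the crash and pull out the SIGSEGV line plus the raw backtrace frames that the Android crash handler prints. This is only a sketch: on a real device the log would come from `adb logcat -d > crash.log`, and the sample log below is fabricated for illustration (the backtrace frames and library path are hypothetical).

```shell
# Sketch: on a device you would capture the log with `adb logcat -d > crash.log`.
# Here we write a made-up sample log so the filtering step can be shown end to end.
cat > crash.log <<'EOF'
I/native: I20220721 11:37:31.700362 16409 graph.cc:478] Start running the graph, waiting for inputs.
A/libc: Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0 in tid 16403 (Thread-29), pid 16273 (app.test)
A/DEBUG: backtrace:
A/DEBUG:     #00 pc 00000000001a2b3c  /data/app/app.test/lib/arm64/libmediapipe_jni.so
A/DEBUG:     #01 pc 00000000001a1f00  /data/app/app.test/lib/arm64/libmediapipe_jni.so
EOF

# Keep only the fatal-signal line and the raw backtrace frames ("#NN pc ..." lines).
grep -E 'Fatal signal|#[0-9]+ pc' crash.log
```

The raw `pc` offsets can then be symbolized with the NDK's `ndk-stack` tool, e.g. `ndk-stack -sym <dir-with-unstripped-libmediapipe_jni.so> -dump crash.log` (the symbol directory depends on your build), which should reveal which Calculator::Open is on the stack.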

hadon avatar Sep 07 '22 06:09 hadon

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you.

google-ml-butler[bot] avatar Jan 13 '23 07:01 google-ml-butler[bot]

Closing as stale. Please reopen if you'd like to work on this further.

google-ml-butler[bot] avatar Jan 20 '23 08:01 google-ml-butler[bot]
