
FaceDetector randomly crashing APP on iOS

Open · Tomas-TravelUnion opened this issue 4 years ago • 5 comments

*** -[NSMutableArray addObjectsFromArray:]: array argument is not an NSArray
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSMutableArray addObjectsFromArray:]: array argument is not an NSArray'
*** First throw call stack:
(0x19998586c 0x1ae9a0c50 0x1999f5e1c 0x1999fc0ec 0x19986f1f4 0x106c23054 0x106c22334 0x19957824c 0x199579db0 0x1995877ac 0x19990111c 0x1998fb120 0x1998fa21c 0x1b14c6784 0x19c33aee8 0x19c34075c 0x104283b84 0x1995ba6b0)
libc++abi.dylib: terminating with uncaught exception of type NSException

• thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGABRT
    frame #0: 0x00000001c78da414 libsystem_kernel.dylib`__pthread_kill + 8

Tomas-TravelUnion · Jun 01 '21 15:06

Yes. I have the same issue with this plugin.

CoderJava · Jun 02 '21 03:06

Me too. My lib version is google_ml_vision: ^0.0.5.

159159951 · Jun 04 '21 15:06

I am trying to replicate the crash. Can you supply the face detector code you are using?

brianmtully · Jun 05 '21 15:06

Same issue here! I'm using the Flutter CameraController. For every frame delivered by controller.startImageStream(), the image taken from the CameraPreview is saved and processed in order to create an image metadata object:

import 'dart:ui' show Size;

import 'package:camera/camera.dart';
import 'package:google_ml_vision/google_ml_vision.dart';

CameraImage? mlCameraImage;
GoogleVisionImageMetadata? mlMetaData;

Future<void> setInputImage(CameraImage image, int rotationDegrees) async {
  mlCameraImage = image;

  // Map the sensor rotation (in degrees) to the plugin's ImageRotation enum.
  // A default case is needed here: with `late` and no default, an unexpected
  // rotation value would cause a LateInitializationError later on.
  late ImageRotation rotation;
  switch (rotationDegrees) {
    case 0:
      rotation = ImageRotation.rotation0;
      break;
    case 90:
      rotation = ImageRotation.rotation90;
      break;
    case 180:
      rotation = ImageRotation.rotation180;
      break;
    case 270:
      rotation = ImageRotation.rotation270;
      break;
    default:
      rotation = ImageRotation.rotation0;
      break;
  }

  // Metadata the detector needs to interpret the raw camera bytes.
  mlMetaData = GoogleVisionImageMetadata(
    rawFormat: image.format.raw,
    size: Size(image.width.toDouble(), image.height.toDouble()),
    planeData: image.planes
        .map((currentPlane) => GoogleVisionImagePlaneMetadata(
              bytesPerRow: currentPlane.bytesPerRow,
              height: currentPlane.height,
              width: currentPlane.width,
            ))
        .toList(),
    rotation: rotation,
  );
}

Then I use mlCameraImage and mlMetaData as input values for the face detection algorithm.
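
For reference, the detection step itself looks roughly like this (a simplified sketch rather than my exact code; the detectFaces helper name and the WriteBuffer plane concatenation are just the usual way to feed a CameraImage to GoogleVisionImage.fromBytes):

import 'dart:typed_data';

import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart' show WriteBuffer;
import 'package:google_ml_vision/google_ml_vision.dart';

// Sketch: concatenate the camera planes into one byte buffer and hand it,
// together with the metadata built above, to the face detector.
Future<List<Face>> detectFaces(
  CameraImage image,
  GoogleVisionImageMetadata metaData,
  FaceDetector detector,
) async {
  final WriteBuffer allBytes = WriteBuffer();
  for (final Plane plane in image.planes) {
    allBytes.putUint8List(plane.bytes);
  }
  final Uint8List bytes = allBytes.done().buffer.asUint8List();

  final GoogleVisionImage visionImage =
      GoogleVisionImage.fromBytes(bytes, metaData);
  return detector.processImage(visionImage);
}

Inside the startImageStream callback it is also worth processing only one frame at a time (a simple bool guard), since processImage is slower than the camera's frame rate.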

I can't share my full code, but after some tests it seems the crash occurs only when just a portion of the face is processed (for example, when the picture contains only half of the user's face).

My detector is:

_mlDetector = GoogleVision.instance.faceDetector(
  FaceDetectorOptions(
    enableClassification: true,
    enableContours: true,
  ),
);

My lib version is google_ml_vision: ^0.0.5, and my device is an iPhone 6 with iOS 12.5.3.

nicola-sarzimadidini · Jun 09 '21 09:06

Edit no.1:

Checked through the Crashlytics tool; it seems the crash is located here:

Runner
FaceDetector.m - Line 143
+[FaceDetector getAllContourPoints:] + 143

Runner
FaceDetector.m - Line 77
__39-[FaceDetector handleDetection:result:]_block_invoke + 77

Hope this might help you.
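
One more thought: since both frames point at getAllContourPoints, a possible temporary workaround (just a guess based on the stack trace, not a proper fix) would be to create the detector without contour extraction:

_mlDetector = GoogleVision.instance.faceDetector(
  // Contours disabled: the crashing native contour path should not be hit.
  FaceDetectorOptions(enableClassification: true),
);

Without enableContours: true the contour code should not run at all, but the underlying bug still needs to be fixed in the plugin's iOS code.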

nicola-sarzimadidini · Jun 10 '21 11:06