
🐛 VisionCamera v4 + Skia color-matrix frameProcessor crashes after switching filters (Expo SDK 53, RN 0.79, Skia 2.0.7)

Open abs007 opened this issue 4 months ago • 10 comments

What's happening?

The default camera preview (no frame processor active) is stable. But once I enable the Skia color-matrix filters and cycle through them (none → monochrome → negative), the app terminates after ~5 seconds, even if I stop switching. Native logs show continuous malloc calls, each taking up about 3.5 MB. Where am I continuously allocating new memory, and how do I correctly release memory that is no longer needed?
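
The only per-frame allocations in my own code are the Skia.Paint() and the color/image filters created inside the worklet (see the repro below). A variant I still want to try is building the paint once per filter change and only calling frame.render(paint) in the worklet. Untested sketch, trimmed to the relevant part of the component:

import { useMemo } from 'react';
import { Skia } from '@shopify/react-native-skia';
import { useSkiaFrameProcessor } from 'react-native-vision-camera';

// Untested sketch: create the paint once per filter change instead of on every frame.
// MONOCHROME_MATRIX, NEGATIVE_MATRIX and currentFilter are the same values as in the repro below.
const paint = useMemo(() => {
  if (currentFilter === 'none') return null;
  const p = Skia.Paint();
  const matrix = currentFilter === 'monochrome' ? MONOCHROME_MATRIX : NEGATIVE_MATRIX;
  p.setColorFilter(Skia.ColorFilter.MakeMatrix(matrix));
  return p;
}, [currentFilter]);

const frameProcessor = useSkiaFrameProcessor((frame) => {
  'worklet';
  // No Skia objects are created here; only draw with the pre-built paint.
  if (paint == null) {
    frame.render();
  } else {
    frame.render(paint);
  }
}, [paint]);

If the growth still happens with this version, the allocations would have to be coming from VisionCamera/Skia internals rather than from my worklet.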

Reproduceable Code

import { AbortButton } from '@/components/AbortButton';
import { ScanRadar } from '@/components/ScanRadar';
import { ToolbarButton } from '@/components/ToolbarButton';
import { colors } from '@/constants/colors';
import { useCameraStore } from '@/store/camera-store';
import { Feather } from '@expo/vector-icons';
import { Skia } from '@shopify/react-native-skia';
import * as Haptics from 'expo-haptics';
import { useRouter } from 'expo-router';
import { StatusBar } from 'expo-status-bar';
import React, { useEffect, useRef, useState } from 'react';
import { ActivityIndicator, Dimensions, Platform, StyleSheet, Text, TouchableOpacity, View } from 'react-native';
import { SafeAreaView } from 'react-native-safe-area-context';
import { Camera, useCameraDevice, useCameraPermission, useSkiaFrameProcessor } from 'react-native-vision-camera';

const { width: screenWidth, height: screenHeight } = Dimensions.get('window');

export default function CameraScreen() {
  const { hasPermission, requestPermission } = useCameraPermission();
  const [facing, setFacing] = useState<'back' | 'front'>('back');
  const device = useCameraDevice(facing);

  const [isFlashOn, setFlashOn] = useState(false);
  const [isProcessing, setIsProcessing] = useState(false);
  const FILTERS = ['none', 'monochrome', 'negative'] as const;
  type FilterId = (typeof FILTERS)[number];
  const [currentFilter, setCurrentFilter] = useState<FilterId>('none');

  const addAnalysisRecord = useCameraStore((state) => state.addAnalysisRecord);
  const router = useRouter();
  const cameraRef = useRef<Camera>(null);

  // Basic color matrices (cheap) for safe, minimal filters
  const MONOCHROME_MATRIX = [
    0.299, 0.587, 0.114, 0, 0,
    0.299, 0.587, 0.114, 0, 0,
    0.299, 0.587, 0.114, 0, 0,
    0, 0, 0, 1, 0,
  ];
  const NEGATIVE_MATRIX = [
    -1, 0, 0, 0, 1,
     0,-1, 0, 0, 1,
     0, 0,-1, 0, 1,
     0, 0, 0, 1, 0,
  ];

  // Only applies when currentFilter !== 'none'. Cheap, color-matrix only.
  const frameProcessor = useSkiaFrameProcessor((frame) => {
    'worklet';
    if (currentFilter === 'none') return;
    const paint = Skia.Paint();
    paint.setAntiAlias(true);
    if (currentFilter === 'monochrome') {
      paint.setImageFilter(Skia.ImageFilter.MakeColorFilter(Skia.ColorFilter.MakeMatrix(MONOCHROME_MATRIX), null));
    } else if (currentFilter === 'negative') {
      paint.setImageFilter(Skia.ImageFilter.MakeColorFilter(Skia.ColorFilter.MakeMatrix(NEGATIVE_MATRIX), null));
    }
    frame.render(paint);
  }, [currentFilter]);

  useEffect(() => {
    if (!hasPermission) {
      requestPermission();
    }
  // console.log(JSON.stringify(device, (k, v) => k === "formats" ? [] : v, 2))
  }, [hasPermission]);

  const handleAbort = () => {
    // router.replace('/');
  };

  const handleCapture = async () => {
    // not relevant here 
  };

  const toggleFlash = () => {
    setFlashOn((v) => !v);
    // Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Light);
  };

  const toggleCameraFacing = () => {
    setFacing((prev) => (prev === 'back' ? 'front' : 'back'));
    // Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Light);
  };

  const cycleFilter = () => {
    const currentIndex = FILTERS.indexOf(currentFilter);
    const nextIndex = (currentIndex + 1) % FILTERS.length;
    const next = FILTERS[nextIndex];
    setCurrentFilter(next);
    // Haptics.impactAsync(Haptics.ImpactFeedbackStyle.Light);
  };

  if (!hasPermission) {
    return (
      <View style={styles.container}>
        <Text style={styles.permissionText}>Requesting camera permission...</Text>
      </View>
    );
  }

  if (!device) {
    return (
      <View style={styles.container}>
        <Text style={styles.permissionText}>No camera device found.</Text>
      </View>
    );
  }

  return (
    <View style={styles.container}>
      <StatusBar style="light" />

      <Camera
        ref={cameraRef}
        style={StyleSheet.absoluteFill}
        device={device}
        isActive={true}
        torch={isFlashOn ? 'on' : 'off'}
        photo={true}
        frameProcessor={currentFilter === 'none' ? undefined : frameProcessor}
      />

      <SafeAreaView style={styles.uiContainer}>
        <>
          <View style={styles.topHUD} />

          <View style={styles.rightToolbar}>
            <ToolbarButton icon={'zap'} onPress={toggleFlash} active={isFlashOn} />
            <ToolbarButton icon={'filter'} onPress={cycleFilter} active={currentFilter !== 'none'} />
          </View>

          {currentFilter !== 'none' && (
            <View style={styles.filterBadge}>
              <Text style={styles.filterBadgeText}>{currentFilter.toUpperCase()}</Text>
            </View>
          )}

          {isProcessing && (
            <View style={styles.processingOverlay}>
              <ActivityIndicator size="large" color={colors.codeGreen} />
              <Text style={styles.processingText}>ANALYZING...</Text>
            </View>
          )}

          <View style={styles.scanningBottomArea}>
            <View style={styles.scanningControls}>
              <View style={styles.abortContainer}>
                <AbortButton onPress={handleAbort} />
              </View>

              <View style={styles.radarContainer}>
                <TouchableOpacity onPress={handleCapture} disabled={isProcessing}>
                  <ScanRadar />
                </TouchableOpacity>
              </View>

              <View style={styles.flipContainer}>
                <TouchableOpacity onPress={toggleCameraFacing} style={styles.flipIconContainer}>
                  <Feather name="rotate-ccw" size={24} color={colors.text} />
                </TouchableOpacity>
                <Text style={styles.flipText}>FLIP</Text>
              </View>
            </View>
          </View>
        </>
      </SafeAreaView>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: 'black',
  },
  permissionText: {
    color: 'white',
    textAlign: 'center',
    marginTop: '50%',
    fontFamily: 'ShareTechMono',
  },
  uiContainer: {
    flex: 1,
    zIndex: 10,
  },
  topHUD: {
    height: 50,
  },
  rightToolbar: {
    position: 'absolute',
    top: 100,
    right: 16,
    alignItems: 'center',
  },
  scanningBottomArea: {
    position: 'absolute',
    bottom: 0,
    left: 0,
    right: 0,
    paddingBottom: 30,
    paddingTop: 20,
    backgroundColor: 'rgba(0,0,0,0.4)',
  },
  scanningControls: {
    flexDirection: 'row',
    justifyContent: 'space-around',
    alignItems: 'center',
  },
  abortContainer: {
    width: 60,
    height: 60,
    justifyContent: 'center',
    alignItems: 'center',
  },
  radarContainer: {
    justifyContent: 'center',
    alignItems: 'center',
  },
  flipContainer: {
    width: 60,
    height: 60,
    justifyContent: 'center',
    alignItems: 'center',
  },
  flipIconContainer: {
    width: 44,
    height: 44,
    borderRadius: 22,
    backgroundColor: 'rgba(255, 255, 255, 0.2)',
    justifyContent: 'center',
    alignItems: 'center',
  },
  flipText: {
    color: colors.text,
    fontFamily: 'ShareTechMono',
    fontSize: 10,
    marginTop: 4,
  },
  processingOverlay: {
    ...StyleSheet.absoluteFillObject,
    backgroundColor: 'rgba(0,0,0,0.7)',
    justifyContent: 'center',
    alignItems: 'center',
    zIndex: 1000,
  },
  processingText: {
    color: colors.codeGreen,
    fontFamily: 'ShareTechMono',
    fontSize: 16,
    marginTop: 16,
  },
  filterBadge: {
    position: 'absolute',
    top: 60,
    alignSelf: 'center',
    backgroundColor: 'rgba(0,0,0,0.6)',
    paddingHorizontal: 10,
    paddingVertical: 4,
    borderRadius: 6,
  },
  filterBadgeText: {
    color: '#fff',
    fontFamily: 'ShareTechMono',
    fontSize: 12,
    letterSpacing: 1,
  },
});

Relevant log output

21:30:53.030: [info] 📸 VisionCamera.didSetProps(_:): Updating 24 props: [onInitialized, cameraId, position, enableBufferCompression, preview, onStarted, onCodeScanned, top, onOutputOrientationChanged, right, isActive, isMirrored, onViewReady, onError, onStopped, onPreviewOrientationChanged, onPreviewStarted, onPreviewStopped, enableFrameProcessor, left, bottom, torch, onShutter, photo]
21:30:53.032: [info] 📸 VisionCamera.configurePreviewOrientation(_:): Updating Preview rotation: portrait...
21:30:53.032: [info] 📸 VisionCamera.configureOutputOrientation(_:): Updating Outputs rotation: portrait...
21:30:53.032: [info] 📸 VisionCamera.configure(_:): configure { ... }: Waiting for lock...
21:30:53.037: [info] 📸 VisionCamera.configure(_:): configure { ... }: Updating CameraSession Configuration... Difference(inputChanged: true, outputsChanged: true, videoStabilizationChanged: true, orientationChanged: true, formatChanged: true, sidePropsChanged: true, torchChanged: true, zoomChanged: true, exposureChanged: true, audioSessionChanged: true, locationChanged: true)
21:30:53.037: [info] 📸 VisionCamera.configureDevice(configuration:): Configuring Input Device...
21:30:53.037: [info] 📸 VisionCamera.configureDevice(configuration:): Configuring Camera com.apple.avfoundation.avcapturedevice.built-in_video:0...
21:30:53.043: [info] 📸 VisionCamera.configureDevice(configuration:): Successfully configured Input Device!
21:30:53.043: [info] 📸 VisionCamera.configureOutputs(configuration:): Configuring Outputs...
21:30:53.043: [info] 📸 VisionCamera.configureOutputs(configuration:): Adding Photo output...
21:30:53.045: [info] 📸 VisionCamera.configurePreviewOrientation(_:): Updating Preview rotation: portrait...
21:30:53.045: [info] 📸 VisionCamera.configureOutputOrientation(_:): Updating Outputs rotation: portrait...
21:30:53.045: [info] 📸 VisionCamera.configureOutputs(configuration:): Successfully configured all outputs!
21:30:53.047: [info] 📸 VisionCamera.setTargetOutputOrientation(_:): Setting target output orientation from device to device...
21:30:54.121: [info] 📸 VisionCamera.init(frame:session:): Preview Layer started previewing.
21:30:54.121: [info] 📸 VisionCamera.configure(_:): Beginning AudioSession configuration...
21:30:54.121: [info] 📸 VisionCamera.configureAudioSession(configuration:): Configuring Audio Session...
21:30:54.122: [info] 📸 VisionCamera.configure(_:): Beginning Location Output configuration...
21:30:54.122: [info] 📸 VisionCamera.configure(_:): Committed AudioSession configuration!
21:30:54.134: [info] 📸 VisionCamera.configure(_:): Finished Location Output configuration!
🟢 Creating JS object for module 'ExpoHaptics'
21:31:02.561: [info] 📸 VisionCamera.didSetProps(_:): Updating 2 props: [preview, enableFrameProcessor]
21:31:02.561: [info] 📸 VisionCamera.configure(_:): configure { ... }: Waiting for lock...
21:31:02.567: [info] 📸 VisionCamera.configure(_:): configure { ... }: Updating CameraSession Configuration... Difference(inputChanged: false, outputsChanged: true, videoStabilizationChanged: true, orientationChanged: true, formatChanged: false, sidePropsChanged: false, torchChanged: false, zoomChanged: false, exposureChanged: false, audioSessionChanged: false, locationChanged: false)
21:31:02.567: [info] 📸 VisionCamera.configureOutputs(configuration:): Configuring Outputs...
21:31:02.568: [info] 📸 VisionCamera.configureOutputs(configuration:): Adding Photo output...
21:31:02.569: [info] 📸 VisionCamera.configureOutputs(configuration:): Adding Video Data output...
21:31:02.655: [info] 📸 VisionCamera.configurePreviewOrientation(_:): Updating Preview rotation: portrait...
21:31:02.655: [info] 📸 VisionCamera.configureOutputOrientation(_:): Updating Outputs rotation: portrait...
21:31:02.655: [info] 📸 VisionCamera.configureOutputs(configuration:): Successfully configured all outputs!
21:31:02.664: [info] 📸 VisionCamera.setTargetOutputOrientation(_:): Setting target output orientation from device to device...
21:31:02.670: [info] 📸 VisionCamera.getPixelFormat(for:): Available Pixel Formats: ["420v", "420f", "BGRA", "&8v0", "-8v0", "&8f0", "-8f0", "&BGA", "-BGA"], finding best match... (pixelFormat="yuv", enableHdr={false}, enableBufferCompression={false})
21:31:02.670: [info] 📸 VisionCamera.getPixelFormat(for:): Using PixelFormat: 420f...
interruptionHandler is called. -[FontServicesDaemonManager connection]_block_invoke

Camera Device

{
  "hardwareLevel": "full",
  "minExposure": -8,
  "minZoom": 1,
  "neutralZoom": 1,
  "position": "back",
  "hasFlash": true,
  "sensorOrientation": "portrait",
  "maxExposure": 8,
  "id": "com.apple.avfoundation.avcapturedevice.built-in_video:0",
  "physicalDevices": [
    "wide-angle-camera"
  ],
  "name": "Back Camera",
  "hasTorch": true,
  "isMultiCam": false,
  "maxZoom": 123.75,
  "supportsFocus": true,
  "formats": [],
  "supportsLowLightBoost": false,
  "supportsRawCapture": false,
  "minFocusDistance": 15
}

Device

iPhone 15 Plus, iOS 18.6

VisionCamera Version

4.7.0

Can you reproduce this issue in the VisionCamera Example app?

No, I cannot reproduce the issue in the Example app

Additional information

abs007 avatar Aug 19 '25 19:08 abs007

Guten Tag, Hans here! 🍻

This issue seems to be well written and includes lots of details, including relevant logs and code. Since you are able to reproduce ze problem but not in ze Example app, it might be a configuration issue specific to your setup.

It would be helpful to check if there are any updates available for your dependencies and make sure everything is aligned properly with ze latest versions. If ze problem persists, I recommend discussing it further with ze community in ze GitHub discussions or consider sponsoring mrousavy so he can prioritize looking into your issue: sponsor mrousavy.

Keep us updated on your progress!

Note: If you think I made a mistake, please ping @mrousavy to take a look.

maintenance-hans[bot] avatar Aug 19 '25 19:08 maintenance-hans[bot]

The reason I couldn't reproduce it in the Example app is that these are the instructions for running the iOS version of the app, but the command fails because the example dir is not inside the package dir:

git clone https://github.com/mrousavy/react-native-vision-camera
cd react-native-vision-camera/package
bun bootstrap

abs007 avatar Aug 19 '25 19:08 abs007

Also, the bot comment above mentions GitHub discussions. I don't see any for this project, though.

abs007 avatar Aug 19 '25 19:08 abs007

Same issue here. @mrousavy, can you please take a look?

Derewith avatar Aug 20 '25 03:08 Derewith

Running into the same issue here. @abs007 did you figure out any new information in the meantime?

toonverbeek avatar Aug 27 '25 08:08 toonverbeek

No solution yet, sadly, but I did see this talk from @mrousavy: https://www.youtube.com/watch?v=BqKEyKleyIA It goes into the specifics of drawing onto the camera using Frame Processors. You might find an answer there, but it's unlikely. Do tell if you find something useful.

abs007 avatar Aug 27 '25 15:08 abs007

There's some very strange behaviour going on. I have a complex Skia frame processor that was working flawlessly. Then I changed some arbitrary code unrelated to the frame processing, and it started crashing with the same behaviour @abs007 describes.

The mind-boggling part for me is that even after I remove that code and revert to the code that was working without issue before, it still crashes. I've tried:

  • npx expo prebuild --clean
  • removing the app from the device (I'm running on a physical device using npx expo ios --device)

The app starts working again after I remove the useSkiaFrameProcessor hook. Even with a callback that simply renders the frame, it still crashes. Again, previously it was working flawlessly.

Right now I can't get it back to a working state. It's like the development build itself gets stuck in a malloc loop.
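
For clarity, the "callback that simply renders the frame" I mention above is nothing more than a pass-through:

const frameProcessor = useSkiaFrameProcessor((frame) => {
  'worklet';
  // No paints, no filters; just draw the camera frame as-is.
  frame.render();
}, []);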

toonverbeek avatar Sep 03 '25 12:09 toonverbeek

@mrousavy can you check please?

Derewith avatar Sep 30 '25 02:09 Derewith

How did you get skia 2.x working with react-native-vision-camera and worklets? Isn't there a namespace conflict on worklets?

EDIT: I ran a test over the weekend and got it working without any conflicts. I upgraded to Expo 53 / React 19 and tested Skia 2.0.0-next and 2.2. All still have a pretty bad memory leak on iOS for me.

I did a ton of refactoring to make all of my Skia operations far more performant and used shared values for pretty much everything in the worklet, but also without luck. I patched RNVC's canvas onLayout to try out Skia 2.21 onwards. I was able to get a camera frame somewhat inconsistently, and the iOS memory graph was not growing; however, I couldn't see the camera consistently enough to verify whether the drawing was working or not.
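
To be concrete, the shared-value pattern I mean looks roughly like this. A sketch only, using useSharedValue from react-native-worklets-core; prebuiltPaints is a hypothetical array of Skia.Paint objects built once up front (null for "none"), and as said above it did not stop the leak for me:

import { useSharedValue } from 'react-native-worklets-core';

// Sketch: drive the worklet from a shared value instead of React state,
// so cycling the filter never re-creates the frame processor.
const filterIndex = useSharedValue(0); // 0 = none, 1 = monochrome, 2 = negative

const frameProcessor = useSkiaFrameProcessor((frame) => {
  'worklet';
  // prebuiltPaints is hypothetical: [null, monochromePaint, negativePaint], created once outside the worklet.
  const paint = prebuiltPaints[filterIndex.value];
  if (paint == null) {
    frame.render();
  } else {
    frame.render(paint);
  }
}, [prebuiltPaints]);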

therealpurplemana avatar Oct 06 '25 17:10 therealpurplemana

Probably not fixing OP's issue, but https://github.com/mrousavy/react-native-vision-camera/pull/3668 will help with the latest Skia, from my understanding 🤓

CanRau avatar Nov 19 '25 17:11 CanRau

The Skia frame processor isn't working yet. On iOS, even if you patch the broken "onLayout", it will still crash or freeze (memory leak). On Android, 10% of devices will not work (black screen or exception). I've done a lot of research and testing to get my production app working, but I can assure you that the Skia frame processor doesn't work with vision-camera as of the latest version 4.7.3 with Skia 2.4.x. My only fix is disabling the frame processor on Android and setting the format to 2048 with fps 15 on iOS to avoid the crash (but it looks horrible).
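
For reference, the iOS mitigation I mention (format capped around 2048 with fps 15) looks roughly like this. A sketch using VisionCamera's useCameraFormat, assuming "2048" means a roughly 2048-wide video resolution; the exact 2048x1536 filter value is illustrative:

import { StyleSheet } from 'react-native';
import { Camera, useCameraDevice, useCameraFormat } from 'react-native-vision-camera';

// Sketch: pick a low-resolution format and cap fps to keep the Skia frame processor alive on iOS.
const device = useCameraDevice('back');
const format = useCameraFormat(device, [
  { videoResolution: { width: 2048, height: 1536 } }, // illustrative 2048-wide target
  { fps: 15 },
]);

// ... (device == null guard omitted)
<Camera
  style={StyleSheet.absoluteFill}
  device={device}
  format={format}
  fps={15}
  isActive={true}
  frameProcessor={frameProcessor}
/>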

maxximee avatar Dec 18 '25 04:12 maxximee