
šŸ› Black camera screen when use frameProcessor

Open · kevinzDzul opened this issue 6 months ago • 1 comment

What's happening?

I’m having the following issue in my application: When I use the frameProcessor property on the <Camera /> component, the camera preview turns very dark or almost black. I can still distinguish shapes, but the preview is much darker than normal. When I remove the frameProcessor property, the camera works perfectly fine with good resolution and normal brightness.

Why does this happen? For context, my app is locked in portrait mode so the screen cannot rotate.
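
One thing I have not ruled out yet: the format is selected with iso: 'min', which requests the lowest sensor gain, and a heavy frame processor can also lower the effective frame rate, so auto-exposure may settle on a darker picture. A minimal comparison sketch (not verified; the 30 fps value and component name are placeholders) keeps a frame processor attached but drops the ISO filter:

import React from 'react';
import {
  Camera,
  useCameraDevice,
  useCameraFormat,
  useFrameProcessor,
} from 'react-native-vision-camera';

// Hypothetical test component: same pixelFormat and a (trivial) frame processor,
// but no `iso: 'min'` filter on the format.
export function BrightnessTest() {
  const device = useCameraDevice('front');
  // Only constrain fps; let the device choose its default ISO / exposure.
  const format = useCameraFormat(device, [{ fps: 30 }]);

  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    // Intentionally empty: if the preview is still dark here, the format/device
    // selection is the likely culprit, not the face-detection / TFLite work.
  }, []);

  if (device == null) return null;

  return (
    <Camera
      style={{ flex: 1 }}
      device={device}
      isActive={true}
      format={format}
      pixelFormat="yuv"
      frameProcessor={frameProcessor}
    />
  );
}

If the preview is bright with this setup, re-adding iso: 'min' and the real frame processor one at a time should show which change makes it dark again.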

Reproducible Code

import {
  Camera,
  CameraPosition,
  useCameraDevice,
  useCameraFormat,
  useFrameProcessor,
} from 'react-native-vision-camera';
import { useEffect, useRef, useState } from 'react';
import { View, StyleSheet, Dimensions, Text } from 'react-native';
import { useTheme } from '@siga/context/themeProvider';
import {
  Face,
  useFaceDetector,
  FaceDetectionOptions,
} from 'react-native-vision-camera-face-detector';
import { useSharedValue, Worklets } from 'react-native-worklets-core';
import { reportError } from '@siga/util/reportError';
import { useResizePlugin } from 'vision-camera-resize-plugin';
import useEfficientDetModel from '@siga/hooks/useEfficientDetModel';

interface Props {
  position?: CameraPosition;
  onCapture: (originalPath: string, resizedFrameData: Float32Array) => void;
  showCircleFace?: boolean;
}

const { width: SCREEN_W, height: SCREEN_H } = Dimensions.get('window');

export default function CameraView({ position = 'front', onCapture, showCircleFace }: Props) {
  const camera = useRef<Camera>(null);
  const { resize } = useResizePlugin();
  const device = useCameraDevice(position);
  const format = useCameraFormat(device, [{ fps: 15, iso: 'min' }]);
  const theme = useTheme();

  const [hasPermission, setHasPermission] = useState(false);
  const [message, setMessage] = useState('Buscando rostro...');
  const [done, setDone] = useState(false);

  const countdownRef = useRef<ReturnType<typeof setInterval> | null>(null);
  const isCounting = useRef(false);

  const { model } = useEfficientDetModel();
  const vectorData = useSharedValue<any>([]);

  const faceDetectionOptions = useRef<FaceDetectionOptions>({
    performanceMode: 'fast',
    classificationMode: 'all',
    windowWidth: SCREEN_W,
    windowHeight: SCREEN_H,
  }).current;
  const { detectFaces } = useFaceDetector(faceDetectionOptions);

  useEffect(() => {
    (async () => {
      const permission = await Camera.requestCameraPermission();
      setHasPermission(permission !== 'denied');
    })();
  }, []);

  const takePhoto = async () => {
    if (camera.current && vectorData.value && !done) {
      try {
        const photo = await camera.current.takeSnapshot();
        onCapture(photo.path, vectorData.value);
        setMessage('Foto tomada exitosamente');
        setDone(true);
        isCounting.current = false;
      } catch (error) {
        reportError(error);
      }
    }
  };

  const startCountdown = () => {
    if (isCounting.current) return;

    isCounting.current = true;
    let seconds = 5;
    setMessage(`Foto en ${seconds}...`);

    countdownRef.current = setInterval(() => {
      seconds--;
      setMessage(`Foto en ${seconds}...`);

      if (seconds === 0) {
        clearInterval(countdownRef.current!);
        countdownRef.current = null;
        takePhoto(); // ✅ no longer checks whether a face is still detected
      }
    }, 1000);
  };

  const handleFrame = Worklets.createRunOnJS((faces: Face[]) => {
    if (done) return;

    const detected = faces.length > 0;

    if (detected && !isCounting.current) {
      startCountdown();
    } else if (!detected && !isCounting.current) {
      setMessage('Buscando rostro...');
    }
  });

  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    const faces = detectFaces(frame);
    handleFrame(faces);

    if (!model || faces.length === 0) return;

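    // Note: bounds.y and bounds.x are swapped into crop.x / crop.y below, presumably
    // to match the landscape-right sensor orientation; worth verifying against the
    // coordinate space the resize plugin expects for its crop.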
    const raw = resize(frame, {
      scale: { width: 160, height: 160 },
      crop: {
        x: faces[0].bounds.y,
        y: faces[0].bounds.x,
        width: faces[0].bounds.width,
        height: faces[0].bounds.height,
      },
      rotation: '270deg',
      pixelFormat: 'rgb',
      dataType: 'float32',
    });

    const detector = model.runSync([raw]);
    vectorData.value = detector[0];
  }, [handleFrame, model, detectFaces, resize]); // include model so the worklet updates once the TFLite model finishes loading

  useEffect(() => {
    return () => {
      if (countdownRef.current) {
        clearInterval(countdownRef.current);
      }
    };
  }, []);

  if (!device || !hasPermission) return null;

  return (
    <View style={styles.container}>
      <Camera
        ref={camera}
        style={styles.camera}
        device={device}
        isActive={!done}
        frameProcessor={frameProcessor}
        pixelFormat="yuv"
        focusable
        isMirrored={false}
        outputOrientation="device"
        format={format}
      />

      {showCircleFace && !done && <View style={styles.circleOverlay} />}

      <View style={styles.messageContainer}>
        <Text style={[styles.messageText, { color: theme.colors.onPrimary }]}>
          {message}
        </Text>
      </View>
    </View>
  );
}

const styles = StyleSheet.create({
  container: { flex: 1 },
  camera: { flex: 1 },
  circleOverlay: {
    position: 'absolute',
    borderStyle: 'dashed',
    top: SCREEN_H / 2 - 240,
    left: SCREEN_W / 2 - 140,
    width: 280,
    height: 400,
    borderRadius: 200,
    borderWidth: 3,
    borderColor: 'white',
    backgroundColor: 'transparent',
    zIndex: 10,
  },
  messageContainer: {
    position: 'absolute',
    top: 50,
    alignSelf: 'center',
    backgroundColor: 'rgba(0,0,0,0.5)',
    padding: 10,
    borderRadius: 8,
    zIndex: 20,
  },
  messageText: {
    fontSize: 18,
    fontWeight: 'bold',
  },
});
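
A variant I still want to try (sketch only, not verified as a fix): throttle the heavy per-frame work with VisionCamera's runAtTargetFps so the camera can keep delivering frames at full rate. This assumes it replaces the frameProcessor inside the component above, where detectFaces, handleFrame and model are already in scope, and the 5 fps target is an arbitrary placeholder:

import { runAtTargetFps, useFrameProcessor } from 'react-native-vision-camera';

const frameProcessor = useFrameProcessor((frame) => {
  'worklet';
  // Let the camera deliver frames at its full rate, but only run the expensive
  // face-detection + TFLite path a few times per second.
  runAtTargetFps(5, () => {
    'worklet';
    const faces = detectFaces(frame);
    handleFrame(faces);
    if (!model || faces.length === 0) return;
    // ...resize + model.runSync exactly as in the component above
  });
}, [handleFrame, model]);

If the preview brightness recovers with the work throttled, the darkening is most likely tied to the processor not keeping up rather than to the format itself.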


Dependencies

  "dependencies": {
    "@react-native-community/geolocation": "^3.4.0",
    "@react-native-community/image-editor": "^4.3.0",
    "@react-native-vector-icons/common": "^11.0.0",
    "@react-native-vector-icons/fontawesome6": "^6.7.1",
    "@react-native-vector-icons/ionicons": "^7.4.0",
    "@react-navigation/bottom-tabs": "^7.2.0",
    "@react-navigation/drawer": "^7.1.1",
    "@react-navigation/native": "^7.0.14",
    "@react-navigation/native-stack": "^7.2.0",
    "@reeq/react-native-device-brightness": "^1.0.6",
    "@sentry/react": "^9.14.0",
    "@sentry/react-native": "^6.11.0",
    "@shopify/react-native-skia": "^2.0.0-next.3",
    "add": "^2.0.6",
    "axios": "^1.8.4",
    "camelcase-keys": "^9.1.3",
    "lottie-react-native": "^7.2.2",
    "react": "19.0.0",
    "react-native": "0.79.1",
    "react-native-blob-jsi-helper": "^0.3.1",
    "react-native-config": "^1.5.5",
    "react-native-fast-tflite": "^1.6.1",
    "react-native-gesture-handler": "^2.25.0",
    "react-native-reanimated": "^3.17.4",
    "react-native-safe-area-context": "^5.4.0",
    "react-native-safe-area-view": "^1.1.1",
    "react-native-screens": "^4.10.0",
    "react-native-svg": "^15.11.2",
    "react-native-vision-camera": "^4.6.4",
    "react-native-vision-camera-face-detector": "^1.8.2",
    "react-native-worklets-core": "^1.5.0",
    "snakecase-keys": "^8.0.1",
    "vision-camera-resize-plugin": "^3.2.0",
    "yarn": "^1.22.22",
    "zustand": "^5.0.3"
}

Relevant log output

Logs don't show the issue

Camera Device

{
  "formats": [],
  "sensorOrientation": "landscape-right",
  "hardwareLevel": "full",
  "maxZoom": 10,
  "minZoom": 1,
  "maxExposure": 20,
  "supportsLowLightBoost": false,
  "neutralZoom": 1,
  "physicalDevices": [
    "wide-angle-camera"
  ],
  "supportsFocus": true,
  "supportsRawCapture": false,
  "isMultiCam": false,
  "minFocusDistance": 0,
  "minExposure": -20,
  "name": "1 (FRONT) androidx.camera.camera2",
  "hasFlash": false,
  "hasTorch": false,
  "position": "front",
  "id": "1"
}

Device

Galaxy A03s

VisionCamera Version

4.6.4

Can you reproduce this issue in the VisionCamera Example app?

I didn't try (⚠️ your issue might get ignored & closed if you don't try this)

Additional information

kevinzDzul · Jun 12 '25 05:06

Guten Tag, Hans here! 🍻

Thanks for providing detailed information about your issue. It looks like you are experiencing a problem with the camera preview being too dark when using the frameProcessor. However, I noticed that there are no relevant logs included with your issue. Logs can provide crucial context for diagnosing the problem.

Please include logs from adb logcat when you reproduce the issue on your device. You can gather this by running the following command in your terminal:

adb logcat
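
To cut down on noise you can also pipe the output through a filter; the exact tag names vary by device and library version, so treat this as a starting point:

adb logcat | grep -iE "visioncamera|camera"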

If you can reproduce the issue in the VisionCamera Example app, that would be very helpful too! This can narrow down whether the problem is specific to your implementation.

Once you provide that additional information, mrousavy will be able to assist you better. Thank you for your understanding!

Note: If you think I made a mistake, please ping @mrousavy to take a look.

maintenance-hans[bot] · Jun 12 '25 05:06

"Logs don't show the issue"

Can't help without logs.

mrousavy · Jul 21 '25 12:07