
🐛 Recording video with audio often triggers `unknown/unknown` error

MathiasWP opened this issue 5 months ago · 11 comments

What's happening?

Before opening the camera I render a page where the user has to grant access to both the camera and the microphone, so I don't understand why this happens. The full error that gets thrown is the following:

Warning: [unknown/unknown]: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (561145187), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x15ecc17d0 {Error Domain=NSOSStatusErrorDomain Code=561145187 "(null)" UserInfo={AVErrorFourCharCode='!rec'}}} (caused by {"message":"Error Domain=AVFoundationErrorDomain Code=-11800 \"The operation could not be completed\" UserInfo={NSLocalizedFailureReason=An unknown error occurred (561145187), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x15ecc17d0 {Error Domain=NSOSStatusErrorDomain Code=561145187 \"(null)\" UserInfo={AVErrorFourCharCode='!rec'}}}","domain":"AVFoundationErrorDomain","details":{"NSUnderlyingError":null,"NSLocalizedDescription":"The operation could not be completed","NSLocalizedFailureReason":"An unknown error occurred (561145187)"},"code":-11800})
    at CameraView (<anonymous>)
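For what it's worth, the OSStatus 561145187 buried in that error is a four-character code; decoded, it reads `'!rec'` (also visible as `AVErrorFourCharCode='!rec'` in the UserInfo), which appears to be AVAudioSessionErrorCodeCannotStartRecording, i.e. the audio session could not start recording. A quick sketch to decode such codes (`fourCharCode` is an illustrative helper, not part of any library):

```typescript
// Decode an OSStatus into its four-character code ("FourCC") by reading
// the four bytes of the 32-bit value from most to least significant.
// 561145187 decodes to '!rec' (AVAudioSessionErrorCodeCannotStartRecording).
function fourCharCode(status: number): string {
  const bytes = [24, 16, 8, 0].map((shift) => (status >> shift) & 0xff)
  return String.fromCharCode(...bytes)
}

console.log(fourCharCode(561145187)) // '!rec'
```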

My code is based on the example app from this library. If I set `audio` to `false`, the error is not thrown. I have added these permissions to my Info.plist:

	<key>NSCameraUsageDescription</key>
	<string>$(PRODUCT_NAME) needs access to your Camera.</string>
	<key>NSMicrophoneUsageDescription</key>
	<string>$(PRODUCT_NAME) needs access to your Microphone.</string>
	<key>NSPhotoLibraryUsageDescription</key>
	<string>Access your photo library</string>
	<key>NSPhotoLibraryAddUsageDescription</key>
	<string>We need access to save images to your photo library</string>

Reproducible Code

/**
* Capture.tsx
*/
import * as React from 'react'
import { useCallback, useEffect, useMemo, useRef, useState } from 'react'
import { StyleSheet, Text, View } from 'react-native'
import { Gesture, GestureDetector } from 'react-native-gesture-handler'
import type { CameraProps, CameraRuntimeError, PhotoFile, VideoFile } from 'react-native-vision-camera'
import { Camera, useCameraDevice, useCameraFormat, useMicrophonePermission } from 'react-native-vision-camera'
import Reanimated, { Extrapolation, interpolate, runOnJS, useAnimatedProps, useSharedValue } from 'react-native-reanimated'
import { CONTENT_SPACING, CONTROL_BUTTON_SIZE, MAX_ZOOM_FACTOR, SAFE_AREA_PADDING, SCREEN_HEIGHT, SCREEN_WIDTH } from './Constants'
import { useIsForeground } from './hooks/useIsForeground'
import { usePreferredCameraDevice } from './hooks/usePreferredCameraDevice'
import { CaptureButton } from './CaptureButton'
import { PressableOpacity } from 'react-native-pressable-opacity'
import { useIsFocused } from '@react-navigation/core'
import { ArrowPathRoundedSquareIcon, BoltIcon as BoltIconOutline, MoonIcon as MoonIconOutline } from 'react-native-heroicons/outline'
import { BoltIcon as BoltIconSolid } from 'react-native-heroicons/solid'
import { useEditScreen } from './EditScreenContext'

const ReanimatedCamera = Reanimated.createAnimatedComponent(Camera)
Reanimated.addWhitelistedNativeProps({
  zoom: true,
})

const SCALE_FULL_ZOOM = 3

export default function Capture() {
  const camera = useRef<Camera>(null)
  const [isCameraInitialized, setIsCameraInitialized] = useState(false)
  const zoom = useSharedValue(1)
  const isPressingButton = useSharedValue(false)
  const [isTakingPhoto, setIsTakingPhoto] = useState(false)
  const isFocused = useIsFocused()
  const isForeground = useIsForeground()
  const { isInEditScreen } = useEditScreen()
  const [cameraPosition, setCameraPosition] = useState<'front' | 'back'>('back')
  const [enableHdr, setEnableHdr] = useState(false)
  const [flash, setFlash] = useState<'off' | 'on'>('off')
  const [enableNightMode, setEnableNightMode] = useState(false)
  const [targetFps, setTargetFps] = useState(30)
  const microphone = useMicrophonePermission()

  // Camera needs to stay active if we're using flash because it takes a while to take the photo
  const isActive = isFocused && isForeground && !isInEditScreen && (!isTakingPhoto || flash === 'on')
  // camera device settings
  const [preferredDevice] = usePreferredCameraDevice()
  let device = useCameraDevice(cameraPosition)

  if (preferredDevice != null && preferredDevice.position === cameraPosition) {
    // override default device with the one selected by the user in settings
    device = preferredDevice
  }

  const screenAspectRatio = SCREEN_HEIGHT / SCREEN_WIDTH
  const format = useCameraFormat(device, [
    { fps: targetFps },
    { videoAspectRatio: screenAspectRatio },
    { videoResolution: 'max' },
    { photoAspectRatio: screenAspectRatio },
    { photoResolution: 'max' },
  ])

  const fps = Math.min(format?.maxFps ?? 1, targetFps)

  const supportsFlash = device?.hasFlash ?? false
  const supportsHdr = format?.supportsPhotoHdr
  const supports60Fps = useMemo(() => device?.formats.some((f) => f.maxFps >= 60), [device?.formats])
  const canToggleNightMode = device?.supportsLowLightBoost ?? false

  //#region Animated Zoom
  const minZoom = device?.minZoom ?? 1
  const maxZoom = Math.min(device?.maxZoom ?? 1, MAX_ZOOM_FACTOR)

  const cameraAnimatedProps = useAnimatedProps<CameraProps>(() => {
    const z = Math.max(Math.min(zoom.value, maxZoom), minZoom)
    return {
      zoom: z,
    }
  }, [maxZoom, minZoom, zoom])
  //#endregion

  //#region Callbacks
  const setIsPressingButton = useCallback(
    (_isPressingButton: boolean) => {
      isPressingButton.value = _isPressingButton
    },
    [isPressingButton],
  )
  const onError = useCallback((error: CameraRuntimeError) => {
    console.error(error)
  }, [])
  
  const onInitialized = useCallback(() => {
    console.log('Camera initialized!')
    setIsCameraInitialized(true)
  }, [])

  const { setEditScreenData, setIsInEditScreen } = useEditScreen()
  
  const onMediaCaptured = useCallback(
    (media: PhotoFile | VideoFile, type: 'photo' | 'video') => {
      setIsTakingPhoto(false)
      setEditScreenData({
        path: media.path,
        type: type,
      })
      setIsInEditScreen(true)
    },
    [setEditScreenData, setIsInEditScreen],
  )
  const onFlipCameraPressed = useCallback(() => {
    setCameraPosition((p) => (p === 'back' ? 'front' : 'back'))
  }, [])
  const onFlashPressed = useCallback(() => {
    setFlash((f) => (f === 'off' ? 'on' : 'off'))
  }, [])
  //#endregion

  //#region Tap Gesture
  const onFocusTap = useCallback(
    (x: number, y: number) => {
      if (!device?.supportsFocus) return
      camera.current?.focus({
        x: x,
        y: y,
      })
    },
    [device?.supportsFocus],
  )
  //#endregion

  //#region Effects
  useEffect(() => {
    // Reset zoom to its default every time the `device` changes.
    zoom.value = device?.neutralZoom ?? 1
  }, [zoom, device])
  //#endregion

  //#region New Gesture System
  // Pinch gesture for zoom
  const pinchGesture = Gesture.Pinch()
    .onStart(() => {
      'worklet'
      // Store the starting zoom value in the gesture context
    })
    .onUpdate((event) => {
      'worklet'
      // Map the scale gesture to a linear zoom
      const startZoom = device?.neutralZoom ?? 1
      const scale = interpolate(
        event.scale,
        [1 - 1 / SCALE_FULL_ZOOM, 1, SCALE_FULL_ZOOM],
        [-1, 0, 1],
        Extrapolation.CLAMP
      )
      zoom.value = interpolate(
        scale,
        [-1, 0, 1],
        [minZoom, startZoom, maxZoom],
        Extrapolation.CLAMP
      )
    })
    .enabled(isActive)

  // Single tap gesture for focus
  const singleTapGesture = Gesture.Tap()
    .maxDuration(250)
    .onEnd((event) => {
      'worklet'
      if (device?.supportsFocus) {
        runOnJS(onFocusTap)(event.x, event.y)
      }
    })
    .enabled(isActive)

  // Double tap gesture for camera flip
  const doubleTapGesture = Gesture.Tap()
    .numberOfTaps(2)
    .maxDuration(250)
    .onEnd(() => {
      'worklet'
      runOnJS(onFlipCameraPressed)()
    })
    .enabled(isActive)

  // Compose gestures - double tap should block single tap
  const composedGestures = Gesture.Exclusive(
    doubleTapGesture,
    Gesture.Simultaneous(singleTapGesture, pinchGesture)
  )
  //#endregion

  useEffect(() => {
    const f =
      format != null
        ? `(${format.photoWidth}x${format.photoHeight} photo / ${format.videoWidth}x${format.videoHeight}@${format.maxFps} video @ ${fps}fps)`
        : undefined
    console.log(`Camera: ${device?.name} | Format: ${f}`)
  }, [device?.name, format, fps])

  const videoHdr = format?.supportsVideoHdr && enableHdr
  const photoHdr = format?.supportsPhotoHdr && enableHdr && !videoHdr

  return (
    <View style={styles.container}>
      {device != null ? (
        <View style={StyleSheet.absoluteFill}>
          <GestureDetector gesture={composedGestures}>
            <Reanimated.View style={StyleSheet.absoluteFill}>
              <ReanimatedCamera
                style={StyleSheet.absoluteFill}
                device={device}
                isActive={isActive}
                ref={camera}
                onInitialized={onInitialized}
                onError={onError}
                onStarted={() => console.log('Camera started!')}
                onStopped={() => console.log('Camera stopped!')}
                onOutputOrientationChanged={(o) => console.log(`Output orientation changed to ${o}!`)}
                onUIRotationChanged={(degrees) => console.log(`UI Rotation changed: ${degrees}°`)}
                format={format}
                fps={fps}
                photoHdr={photoHdr}
                videoHdr={videoHdr}
                photoQualityBalance="speed"
                lowLightBoost={device.supportsLowLightBoost && enableNightMode}
                enableZoomGesture={false}
                animatedProps={cameraAnimatedProps}
                exposure={0}
                enableFpsGraph={false}
                outputOrientation="device"
                photo={true}
                video={true}
                audio={microphone.hasPermission}
              />
            </Reanimated.View>
          </GestureDetector>
        </View>
      ) : (
        <View style={styles.emptyContainer}>
          <Text style={styles.text}>Your phone does not have a Camera.</Text>
        </View>
      )}

      <CaptureButton
        style={styles.captureButton}
        camera={camera}
        onMediaCaptured={onMediaCaptured}
        onTakingPhotoStarted={() => setIsTakingPhoto(true)}
        cameraZoom={zoom}
        minZoom={minZoom}
        maxZoom={maxZoom}
        flash={supportsFlash ? flash : 'off'}
        enabled={isCameraInitialized && isActive}
        setIsPressingButton={setIsPressingButton}
      />

      <View style={styles.rightButtonRow}>
        {supports60Fps && (
          <PressableOpacity style={styles.button} onPress={() => setTargetFps((t) => (t === 30 ? 60 : 30))}>
            <Text style={styles.text}>{`${targetFps}\nFPS`}</Text>
          </PressableOpacity>
        )}

        {supportsHdr && (
          <PressableOpacity style={styles.button} onPress={() => setEnableHdr((h) => !h)}>
            <Text style={styles.text}>{enableHdr ? 'hdr' : 'hdr-off'}</Text>
          </PressableOpacity>
        )}
        {canToggleNightMode && (
          <PressableOpacity style={styles.button} onPress={() => setEnableNightMode(!enableNightMode)} disabledOpacity={0.4}>
            <MoonIconOutline color="white" size={24} />
          </PressableOpacity>
        )}
        {supportsFlash && (
          <PressableOpacity style={styles.button} onPress={onFlashPressed} disabledOpacity={0.4}>
            {flash === 'on' ? (
              <BoltIconSolid color="white" size={24} />
            ) : (
              <BoltIconOutline color="white" size={24} />
            )}
          </PressableOpacity>
        )}
        <PressableOpacity style={styles.button} onPress={onFlipCameraPressed} disabledOpacity={0.4}>
          <ArrowPathRoundedSquareIcon color="white" size={24} />
        </PressableOpacity>
      </View>
    </View>
  )
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: 'black',
  },
  captureButton: {
    position: 'absolute',
    alignSelf: 'center',
    bottom: SAFE_AREA_PADDING.paddingBottom + 30
  },
  button: {
    marginBottom: CONTENT_SPACING,
    width: CONTROL_BUTTON_SIZE,
    height: CONTROL_BUTTON_SIZE,
    borderRadius: CONTROL_BUTTON_SIZE / 2,
    backgroundColor: 'rgba(140, 140, 140, 0.3)',
    justifyContent: 'center',
    alignItems: 'center',
  },
  rightButtonRow: {
    position: 'absolute',
    right: SAFE_AREA_PADDING.paddingRight,
    top: SAFE_AREA_PADDING.paddingTop,
  },
  text: {
    color: 'white',
    fontSize: 11,
    fontWeight: 'bold',
    textAlign: 'center',
  },
  emptyContainer: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
  },
})
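As an aside on the pinch handler in Capture.tsx above: the gesture scale is mapped to zoom via two chained clamped interpolations. A standalone sketch of that mapping (using a local `interpolateClamp` as a stand-in for Reanimated's `interpolate` with `Extrapolation.CLAMP`; the function names here are illustrative only):

```typescript
const SCALE_FULL_ZOOM = 3

// Clamped piecewise-linear interpolation over a sorted input range,
// behaviourally equivalent to Reanimated's interpolate(..., Extrapolation.CLAMP).
function interpolateClamp(x: number, input: number[], output: number[]): number {
  if (x <= input[0]) return output[0]
  if (x >= input[input.length - 1]) return output[output.length - 1]
  let i = 0
  while (x > input[i + 1]) i++
  const t = (x - input[i]) / (input[i + 1] - input[i])
  return output[i] + t * (output[i + 1] - output[i])
}

// Step 1: map the pinch scale to a linear [-1, 1] factor; step 2: map that
// factor to [minZoom, neutralZoom, maxZoom], as in the pinch gesture above.
function zoomForPinch(scale: number, minZoom: number, neutralZoom: number, maxZoom: number): number {
  const linear = interpolateClamp(scale, [1 - 1 / SCALE_FULL_ZOOM, 1, SCALE_FULL_ZOOM], [-1, 0, 1])
  return interpolateClamp(linear, [-1, 0, 1], [minZoom, neutralZoom, maxZoom])
}

console.log(zoomForPinch(1, 1, 2, 16)) // no pinch → neutral zoom: 2
console.log(zoomForPinch(3, 1, 2, 16)) // full pinch out → max zoom: 16
```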


/**
* CaptureButton.tsx
*/
import React, { RefObject, useCallback, useRef } from 'react'
import type { ViewProps } from 'react-native'
import { StyleSheet, View } from 'react-native'
import type { PanGestureHandlerGestureEvent, TapGestureHandlerStateChangeEvent } from 'react-native-gesture-handler'
import { PanGestureHandler, State, TapGestureHandler } from 'react-native-gesture-handler'
import Reanimated, {
  cancelAnimation,
  Easing,
  interpolate,
  useAnimatedStyle,
  withSpring,
  withTiming,
  useAnimatedGestureHandler,
  useSharedValue,
  withRepeat,
  Extrapolation,
} from 'react-native-reanimated'
import type { Camera, PhotoFile, VideoFile } from 'react-native-vision-camera'
import { CAPTURE_BUTTON_SIZE, SCREEN_HEIGHT, SCREEN_WIDTH } from './Constants'

const START_RECORDING_DELAY = 200
const BORDER_WIDTH = CAPTURE_BUTTON_SIZE * 0.05

interface Props extends ViewProps {
  camera: RefObject<Camera | null>
  onMediaCaptured: (media: PhotoFile | VideoFile, type: 'photo' | 'video') => void
  onTakingPhotoStarted: () => void
  minZoom: number
  maxZoom: number
  cameraZoom: Reanimated.SharedValue<number>

  flash: 'off' | 'on'

  enabled: boolean

  setIsPressingButton: (isPressingButton: boolean) => void
}

const _CaptureButton: React.FC<Props> = ({
  camera,
  onMediaCaptured,
  onTakingPhotoStarted,
  minZoom,
  maxZoom,
  cameraZoom,
  flash,
  enabled,
  setIsPressingButton,
  style,
  ...props
}): React.ReactElement => {
  const pressDownDate = useRef<Date | undefined>(undefined)
  const isRecording = useRef(false)
  const recordingProgress = useSharedValue(0)
  const isPressingButton = useSharedValue(false)
  const isRecordingVideo = useSharedValue(false)

  //#region Camera Capture
  const takePhoto = useCallback(async () => {
    try {
      if (camera.current == null) throw new Error('Camera ref is null!')
      onTakingPhotoStarted()
      const photo = await camera.current.takePhoto({
        flash,
        enableShutterSound: false,
      })
      onMediaCaptured(photo, 'photo')
    } catch (e) {
      console.error('Failed to take photo!', e)
    }
  }, [camera, flash, onMediaCaptured])

  const onStoppedRecording = useCallback(() => {
    isRecording.current = false
    isRecordingVideo.value = false
    cancelAnimation(recordingProgress)
    console.log('stopped recording video!')
  }, [recordingProgress, isRecordingVideo])
  const stopRecording = useCallback(async () => {
    try {
      if (camera.current == null) throw new Error('Camera ref is null!')

      console.log('calling stopRecording()...')
      await camera.current.stopRecording()
      console.log('called stopRecording()!')
    } catch (e) {
      console.error('failed to stop recording!', e)
    }
  }, [camera])

  const startRecording = useCallback(() => {
    try {
      if (camera.current == null) throw new Error('Camera ref is null!')
      console.log('calling startRecording()...')
      camera.current.startRecording({
        flash: flash,
        videoCodec: 'h265',
        onRecordingError: (error) => {
          console.error('Recording failed!', error)
          onStoppedRecording()
        },
        onRecordingFinished: (video) => {
          console.log(`Recording successfully finished! ${video.path}`)
          onMediaCaptured(video, 'video')
          onStoppedRecording()
        },
      })
      // TODO: wait until startRecording returns to actually find out if the recording has successfully started
      console.log('called startRecording()!')
      isRecording.current = true
      isRecordingVideo.value = true
    } catch (e) {
      console.error('failed to start recording!', e, 'camera')
    }
  }, [camera, flash, onMediaCaptured, onStoppedRecording])
  //#endregion

  //#region Tap handler
  const tapHandler = useRef<TapGestureHandler>(null)
  const onHandlerStateChanged = useCallback(
    async ({ nativeEvent: event }: TapGestureHandlerStateChangeEvent) => {
      // This is the gesture handler for the circular "shutter" button.
      // Once the finger touches the button (State.BEGAN), we enter "capture mode" (the tab bar gets disabled),
      // set `pressDownDate` to the time of the press-down event, and start a 200ms timeout. If `pressDownDate`
      // hasn't changed after those 200ms, the user is still holding down the "shutter" button, so we start recording.
      //
      // Once the finger releases the button (State.END/FAILED/CANCELLED), we leave "capture mode" (the tab bar is
      // re-enabled) and check `pressDownDate`: if the press started less than 200ms ago, the user's intention was
      // to take a single photo, so we call takePhoto(). Otherwise the user has been recording this entire time,
      // so we call stopRecording().
      console.debug(`state: ${Object.keys(State)[event.state]}`)
      switch (event.state) {
        case State.BEGAN: {
          // enter "recording mode"
          recordingProgress.value = 0
          isPressingButton.value = true
          const now = new Date()
          pressDownDate.current = now
          setTimeout(() => {
            if (pressDownDate.current === now) {
              // user is still pressing down after 200ms, so his intention is to create a video
              startRecording()
            }
          }, START_RECORDING_DELAY)
          setIsPressingButton(true)
          return
        }
        case State.END:
        case State.FAILED:
        case State.CANCELLED: {
          // exit "recording mode"
          try {
            if (pressDownDate.current == null) throw new Error('PressDownDate ref .current was null!')
            const now = new Date()
            const diff = now.getTime() - pressDownDate.current.getTime()
            pressDownDate.current = undefined
            if (diff < START_RECORDING_DELAY) {
              // user has released the button within 200ms, so his intention is to take a single picture.
              await takePhoto()
            } else {
              // user has held the button for more than 200ms, so he has been recording this entire time.
              await stopRecording()
            }
          } finally {
            setTimeout(() => {
              isPressingButton.value = false
              setIsPressingButton(false)
            }, 500)
          }
          return
        }
        default:
          break
      }
    },
    [isPressingButton, recordingProgress, setIsPressingButton, startRecording, stopRecording, takePhoto],
  )
  //#endregion
  //#region Pan handler
  const panHandler = useRef<PanGestureHandler>(null)
  const onPanGestureEvent = useAnimatedGestureHandler<PanGestureHandlerGestureEvent, { offsetY?: number; startY?: number }>({
    onStart: (event, context) => {
      context.startY = event.absoluteY
      // Increase drag distance by using a much smaller multiplier (0.1 instead of 0.7)
      // This means you need to drag much further to reach full zoom
      const yForFullZoom = context.startY * 0.1
      const offsetYForFullZoom = context.startY - yForFullZoom

      // extrapolate [0 ... 1] zoom -> [0 ... Y_FOR_FULL_ZOOM] finger position
      context.offsetY = interpolate(cameraZoom.value, [minZoom, maxZoom], [0, offsetYForFullZoom], Extrapolation.CLAMP)
    },
    onActive: (event, context) => {
      const offset = context.offsetY ?? 0
      const startY = context.startY ?? SCREEN_HEIGHT
      // Use the same multiplier for consistency
      const yForFullZoom = startY * 0.1

      cameraZoom.value = interpolate(event.absoluteY - offset, [yForFullZoom, startY], [maxZoom, minZoom], Extrapolation.CLAMP)
    },
  })
  //#endregion

  const shadowStyle = useAnimatedStyle(
    () => ({
      transform: [
        {
          scale: withSpring(isRecordingVideo.value ? 1 : 0, {
            mass: 1,
            damping: 35,
            stiffness: 300,
          }),
        },
      ],
    }),
    [isRecordingVideo],
  )
  const buttonStyle = useAnimatedStyle(() => {
    let scale: number
    if (enabled) {
      if (isPressingButton.value) {
        scale = withRepeat(
          withSpring(1, {
            stiffness: 100,
            damping: 1000,
          }),
          -1,
          true,
        )
      } else {
        scale = withSpring(0.9, {
          stiffness: 500,
          damping: 300,
        })
      }
    } else {
      scale = withSpring(0.6, {
        stiffness: 500,
        damping: 300,
      })
    }

    return {
      opacity: withTiming(enabled ? 1 : 0.3, {
        duration: 100,
        easing: Easing.linear,
      }),
      transform: [
        {
          scale: scale,
        },
      ],
    }
  }, [enabled, isPressingButton])

  return (
    <TapGestureHandler
      enabled={enabled}
      ref={tapHandler}
      onHandlerStateChange={onHandlerStateChanged}
      shouldCancelWhenOutside={false}
      maxDurationMs={99999999} // <-- this prevents the TapGestureHandler from going to State.FAILED when the user moves his finger outside of the child view (to zoom)
      simultaneousHandlers={panHandler}>
      <Reanimated.View {...props} style={[buttonStyle, style]}>
        <PanGestureHandler
          enabled={enabled}
          ref={panHandler}
          failOffsetX={[-SCREEN_WIDTH, SCREEN_WIDTH]}
          activeOffsetY={[-2, 2]}
          onGestureEvent={onPanGestureEvent}
          simultaneousHandlers={tapHandler}>
          <Reanimated.View style={styles.flex}>
            <Reanimated.View style={[styles.shadow, shadowStyle]} />
            <View style={styles.button} />
          </Reanimated.View>
        </PanGestureHandler>
      </Reanimated.View>
    </TapGestureHandler>
  )
}

export const CaptureButton = React.memo(_CaptureButton)

const styles = StyleSheet.create({
  flex: {
    flex: 1,
  },
  shadow: {
    position: 'absolute',
    width: CAPTURE_BUTTON_SIZE,
    height: CAPTURE_BUTTON_SIZE,
    borderRadius: CAPTURE_BUTTON_SIZE / 2,
    backgroundColor: '#e34077',
  },
  button: {
    width: CAPTURE_BUTTON_SIZE,
    height: CAPTURE_BUTTON_SIZE,
    borderRadius: CAPTURE_BUTTON_SIZE / 2,
    borderWidth: BORDER_WIDTH,
    borderColor: 'white',
  },
})
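The photo-vs-video decision in the tap handler above boils down to the press duration; isolated as a pure function (a sketch; `resolveCaptureIntent` is a hypothetical name, not library API):

```typescript
const START_RECORDING_DELAY = 200 // same threshold as in CaptureButton.tsx

// A release within START_RECORDING_DELAY ms of the press means "take a photo";
// any longer means a recording was started at the 200ms mark and must be stopped.
function resolveCaptureIntent(pressDownAt: number, releasedAt: number): 'photo' | 'video' {
  return releasedAt - pressDownAt < START_RECORDING_DELAY ? 'photo' : 'video'
}

console.log(resolveCaptureIntent(0, 120)) // 'photo'
console.log(resolveCaptureIntent(0, 650)) // 'video'
```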

Relevant log output

The same error as I pasted above gets logged. It happens before the state becomes ACTIVE and before `stopRecording` is called.

Here are more logs:

VisionCamera.initializeVideoTrack(withSettings:): Initialized Video AssetWriter.
15:18:10.449: [info] 📸 VisionCamera.start(): Starting Asset Writer...
15:18:10.478: [info] 📸 VisionCamera.start(): Asset Writer started!
15:18:10.478: [info] 📸 VisionCamera.start(): Asset Writer session started at 91915.189441083.
15:18:10.478: [info] 📸 VisionCamera.start(): Requesting video timeline start at 91915.189626875...
15:18:10.478: [info] 📸 VisionCamera.start(): Requesting audio timeline start at 91915.189667958...
15:18:10.478: [info] 📸 VisionCamera.startRecording(options:onVideoRecorded:onError:): RecordingSesssion started in 48.105375ms!
15:18:10.479: [info] 📸 VisionCamera.activateAudioSession(): Audio Session activated!
15:18:10.503: [info] 📸 VisionCamera.isTimestampWithinTimeline(timestamp:): video Timeline: First timestamp: 91915.179529625
15:18:10.608: [error] 📸 VisionCamera.sessionRuntimeError(notification:): Unexpected Camera Runtime Error occured!
15:18:10.608: [error] 📸 VisionCamera.onError(_:): Invoking onError(): Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (561145187), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x169349c20 {Error Domain=NSOSStatusErrorDomain Code=

Camera Device

{
  "formats": [],
  "isMultiCam": false,
  "supportsFocus": true,
  "physicalDevices": [
    "wide-angle-camera"
  ],
  "hardwareLevel": "full",
  "supportsLowLightBoost": false,
  "hasTorch": true,
  "supportsRawCapture": false,
  "maxExposure": 8,
  "id": "com.apple.avfoundation.avcapturedevice.built-in_video:0",
  "neutralZoom": 1,
  "minFocusDistance": 12,
  "hasFlash": true,
  "name": "Back Camera",
  "maxZoom": 123.75,
  "minExposure": -8,
  "sensorOrientation": "portrait",
  "minZoom": 1,
  "position": "back"
}

Device

iPhone 12 mini, iOS 18.6

VisionCamera Version

4.7.0

Can you reproduce this issue in the VisionCamera Example app?

Yes. I copied over the code base into my app and the error was still triggered.

Additional information

MathiasWP · Jul 17 '25 13:07

Guten Tag, Hans here 🍻

I see you have quite a detailed report, which is good! However, it looks like you did not check if this issue can be reproduced in the VisionCamera Example app. Please try that and let us know if you encounter the same error. If you still experience the issue, it would also be helpful for mrousavy if you could provide further details like the iOS logs directly from Xcode. This will help us diagnose the issue much faster! If you need guidance on gathering those logs, just ask!

Danke!

Note: If you think I made a mistake, please ping @mrousavy to take a look.

maintenance-hans[bot] · Jul 17 '25 13:07

Extra information:

I'm making a Snapchat-like app, and after the user has recorded a video I play it using react-native-video. While the video is playing I still have the Camera mounted in a stack below with `isActive` set to `false`, since this is recommended for better performance. Is it possible that there is an audio conflict here? Setting `muted={true}` on the react-native-video component still triggers the error, but if I completely remove the Video renderer, I can no longer trigger it.

It seems like react-native-video and react-native-vision-camera are fighting over something, which causes recording video with audio to fail after unmounting the Video component from react-native-video.

MathiasWP · Jul 17 '25 14:07

I tried the example app now, because I noticed that the react-native-video library is used there as well. I was thinking that maybe the navigation setup in the example app fixed the issue, but no: I managed to trigger the error there as well. Note that I used the exact same source code, but with all dependencies updated to their latest versions.

MathiasWP · Jul 17 '25 15:07

It looks like you're experiencing the same issue: https://github.com/mrousavy/react-native-vision-camera/issues/3560

I ran into this as well while using react-native-vision-camera together with react-native-video. I was able to resolve it by adding the following prop:

`<Video disableAudioSessionManagement={true} />`

Hope this helps!

longnguyenegs · Jul 18 '25 06:07

> It looks like you're experiencing the same issue: #3560
>
> I ran into this as well while using react-native-vision-camera together with react-native-video. I was able to resolve it by adding the following prop:
>
> `<Video disableAudioSessionManagement={true} />`
>
> Hope this helps!

What are your react-native-vision-camera and react-native-video versions?

cloud-github · Jul 25 '25 08:07

> It looks like you're experiencing the same issue: #3560 I ran into this as well while using react-native-vision-camera together with react-native-video. I was able to resolve it by adding the following prop:
>
> `<Video disableAudioSessionManagement={true} />`
>
> Hope this helps!
>
> What are your react-native-vision-camera and react-native-video versions?

@cloud-github: "react-native-video": "6.15.0", "react-native-vision-camera": "4.6.4"

longnguyenegs · Jul 25 '25 09:07

> It looks like you're experiencing the same issue: #3560 I ran into this as well while using react-native-vision-camera together with react-native-video. I was able to resolve it by adding the following prop:
>
> `<Video disableAudioSessionManagement={true} />`
>
> Hope this helps!
>
> What are your react-native-vision-camera and react-native-video versions?
>
> @cloud-github: "react-native-video": "6.15.0", "react-native-vision-camera": "4.6.4"

Sorry, that combination is not working for me.

"dependencies": {
  "@react-navigation/native": "^7.0.15",
  "@react-navigation/native-stack": "^7.3.21",
  "@react-navigation/stack": "^7.1.2",
  "react": "18.3.1",
  "react-native": "0.76.0",
  "react-native-gesture-handler": "^2.24.0",
  "react-native-safe-area-context": "^5.3.0",
  "react-native-screens": "4.4.0",
  "react-native-video": "6.15.0",
  "react-native-vision-camera": "4.6.4",
  "react-native-webview": "^13.13.2",
  "rn-fetch-blob": "^0.12.0",
  "uuid": "^11.1.0"
}

Camera.js

```js
import React, { useState, useRef, useEffect } from 'react';
import {
  View,
  Text,
  TouchableOpacity,
  StyleSheet,
  Alert,
  Dimensions,
  ActivityIndicator,
  Linking,
} from 'react-native';
import {
  Camera,
  useCameraDevice,
  useCameraPermission,
  useMicrophonePermission,
} from 'react-native-vision-camera';

const { width, height } = Dimensions.get('window');

const CameraComponent = ({ onVideoRecorded }) => {
  const [isRecording, setIsRecording] = useState(false);
  const [recordedVideoPath, setRecordedVideoPath] = useState(null);
  const [isCameraInitialized, setIsCameraInitialized] = useState(false);
  const cameraRef = useRef(null);

  // Use the latest API - useCameraDevice instead of useCameraDevices
  const backDevice = useCameraDevice('back');
  const frontDevice = useCameraDevice('front');

  // Prefer the back camera, fall back to the front
  const device = backDevice || frontDevice;

  const { hasPermission: hasCameraPermission, requestPermission: requestCameraPermission } = useCameraPermission();
  const { hasPermission: hasMicrophonePermission, requestPermission: requestMicrophonePermission } = useMicrophonePermission();

  useEffect(() => {
    checkPermissions();
  }, []);

  // Debug device information
  useEffect(() => {
    console.log('Back device:', backDevice);
    console.log('Front device:', frontDevice);
    console.log('Selected device:', device);

    if (device) {
      console.log('Device details:', {
        id: device.id,
        name: device.name,
        position: device.position,
        hasFlash: device.hasFlash,
        hasTorch: device.hasTorch,
        supportsRawCapture: device.supportsRawCapture,
        supportsLowLightBoost: device.supportsLowLightBoost,
      });
    }
  }, [backDevice, frontDevice, device]);

  const checkPermissions = async () => {
    try {
      let cameraGranted = hasCameraPermission;
      let micGranted = hasMicrophonePermission;

      if (!hasCameraPermission) {
        console.log('Requesting camera permission...');
        cameraGranted = await requestCameraPermission();
        console.log('Camera permission granted:', cameraGranted);
      }

      if (!hasMicrophonePermission) {
        console.log('Requesting microphone permission...');
        micGranted = await requestMicrophonePermission();
        console.log('Microphone permission granted:', micGranted);
      }

      if (cameraGranted && micGranted) {
        console.log('All permissions granted, camera ready!');
        setIsCameraInitialized(true);
      } else {
        Alert.alert(
          'Permissions Required',
          'Camera and microphone permissions are needed to use this feature',
          [
            { text: 'Open Settings', onPress: () => Linking.openSettings() },
            { text: 'Cancel', style: 'cancel' },
          ]
        );
      }
    } catch (error) {
      console.error('Permission check error:', error);
      Alert.alert('Error', 'Failed to check permissions');
    }
  };

  const startRecording = async () => {
    try {
      if (cameraRef.current && !isRecording && device) {
        console.log('Starting recording with device:', device.name);
        setIsRecording(true);

        await cameraRef.current.startRecording({
          flash: 'off',
          onRecordingFinished: (video) => {
            console.log('Recording finished:', video.path);
            setRecordedVideoPath(video.path);
            setIsRecording(false);

            if (onVideoRecorded) {
              onVideoRecorded(video.path);
            }
          },
          onRecordingError: (error) => {
            console.error('Recording error:', error);
            setIsRecording(false);
            Alert.alert('Recording Error', error.message || 'Unknown recording error');
          },
        });
      }
    } catch (error) {
      console.error('Start recording error:', error);
      setIsRecording(false);
      Alert.alert('Error', `Failed to start recording: ${error.message}`);
    }
  };

  const stopRecording = async () => {
    try {
      if (cameraRef.current && isRecording) {
        console.log('Stopping recording...');
        await cameraRef.current.stopRecording();
      }
    } catch (error) {
      console.error('Stop recording error:', error);
      Alert.alert('Error', `Failed to stop recording: ${error.message}`);
    }
  };

  const toggleRecording = () => {
    if (isRecording) {
      stopRecording();
    } else {
      startRecording();
    }
  };

  const onCameraInitialized = () => {
    console.log('Camera initialized successfully');
  };

  const onCameraError = (error) => {
    console.error('Camera error:', error);
    Alert.alert('Camera Error', error.message);
  };

  // Check permissions
  if (!hasCameraPermission || !hasMicrophonePermission) {
    return (
      <View style={styles.permissionContainer}>
        <Text style={styles.permissionText}>
          Camera and microphone permissions are required
        </Text>
        <TouchableOpacity style={styles.permissionButton} onPress={checkPermissions}>
          <Text style={styles.permissionButtonText}>Grant Permissions</Text>
        </TouchableOpacity>
      </View>
    );
  }

  // Check if a camera device is available
  if (!device) {
    return (
      <View style={styles.permissionContainer}>
        <Text style={styles.permissionText}>No camera device available</Text>
        <Text style={[styles.permissionText, { fontSize: 14, marginTop: 10 }]}>
          Back camera: {backDevice ? 'Available' : 'Not found'}
        </Text>
        <Text style={[styles.permissionText, { fontSize: 14, marginTop: 5 }]}>
          Front camera: {frontDevice ? 'Available' : 'Not found'}
        </Text>
        <TouchableOpacity
          style={styles.permissionButton}
          onPress={() => {
            // Force re-render to check devices again
            setIsCameraInitialized(false);
            setTimeout(() => checkPermissions(), 100);
          }}
        >
          <Text style={styles.permissionButtonText}>Retry</Text>
        </TouchableOpacity>
      </View>
    );
  }

  // Loading state
  if (!isCameraInitialized) {
    return (
      <View style={styles.permissionContainer}>
        <ActivityIndicator size="large" color="white" />
        <Text style={styles.permissionText}>Initializing camera...</Text>
        <Text style={[styles.permissionText, { fontSize: 12, marginTop: 10 }]}>
          Using: {device.name} ({device.position})
        </Text>
      </View>
    );
  }

  return (
    <View style={styles.container}>
      <Camera
        ref={cameraRef}
        style={styles.camera}
        device={device}
        isActive={true}
        video={true}
        audio={true}
        enableZoomGesture={true}
        onInitialized={onCameraInitialized}
        onError={onCameraError}
      />

      <View style={styles.controlsContainer}>
        <TouchableOpacity
          style={[
            styles.recordButton,
            isRecording ? styles.recordingButton : styles.idleButton,
          ]}
          onPress={toggleRecording}
          disabled={!device}
        >
          <View style={[
            styles.recordButtonInner,
            isRecording ? styles.recordingInner : styles.idleInner,
          ]}>
            {isRecording && <Text style={styles.recordingText}>REC</Text>}
          </View>
        </TouchableOpacity>

        <Text style={styles.statusText}>
          {isRecording ? 'Recording...' : 'Tap to record'}
        </Text>

        <Text style={styles.deviceText}>
          {device.name} ({device.position})
        </Text>

        {recordedVideoPath && (
          <Text style={styles.pathText} numberOfLines={2}>
            Last recorded: {recordedVideoPath}
          </Text>
        )}
      </View>
    </View>
  );
};

const styles = StyleSheet.create({
  container: { flex: 1, backgroundColor: 'black' },
  camera: { flex: 1 },
  permissionContainer: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    backgroundColor: 'black',
    padding: 20,
  },
  permissionText: { color: 'white', fontSize: 16, textAlign: 'center', marginBottom: 20 },
  permissionButton: { backgroundColor: '#007AFF', paddingHorizontal: 20, paddingVertical: 10, borderRadius: 8 },
  permissionButtonText: { color: 'white', fontSize: 16, fontWeight: 'bold' },
  controlsContainer: { position: 'absolute', bottom: 50, left: 0, right: 0, alignItems: 'center', paddingHorizontal: 20 },
  recordButton: { width: 80, height: 80, borderRadius: 40, justifyContent: 'center', alignItems: 'center', marginBottom: 20 },
  idleButton: { backgroundColor: 'rgba(255, 255, 255, 0.3)', borderWidth: 3, borderColor: 'white' },
  recordingButton: { backgroundColor: 'rgba(255, 0, 0, 0.3)', borderWidth: 3, borderColor: 'red' },
  recordButtonInner: { width: 60, height: 60, borderRadius: 30, justifyContent: 'center', alignItems: 'center' },
  idleInner: { backgroundColor: 'white' },
  recordingInner: { backgroundColor: 'red' },
  recordingText: { color: 'white', fontSize: 12, fontWeight: 'bold' },
  statusText: { color: 'white', fontSize: 16, fontWeight: 'bold', marginBottom: 10 },
  deviceText: { color: 'rgba(255, 255, 255, 0.8)', fontSize: 12, marginBottom: 10 },
  pathText: { color: 'rgba(255, 255, 255, 0.7)', fontSize: 12, textAlign: 'center' },
});

export default CameraComponent;
```
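The guards in `startRecording`/`stopRecording` above amount to a two-state machine (idle ↔ recording). As a minimal sketch, the same logic can be isolated in plain JavaScript (`createRecorderState` is a hypothetical helper, not part of either library) to check that a start can never fire twice without an intervening stop:

```javascript
// Sketch of the recording guards as a tiny state machine.
// start() succeeds only when idle, stop() only while recording.
function createRecorderState() {
  let recording = false;
  return {
    start() {
      if (recording) return false; // mirrors the `!isRecording` guard
      recording = true;
      return true;
    },
    stop() {
      if (!recording) return false; // mirrors the `isRecording` guard
      recording = false;
      return true;
    },
    isRecording: () => recording,
  };
}

const rec = createRecorderState();
console.log(rec.start()); // true  - idle -> recording
console.log(rec.start()); // false - double-start rejected
console.log(rec.stop());  // true  - recording -> idle
console.log(rec.stop());  // false - double-stop rejected
```

Since React state updates are asynchronous, keeping this invariant in a ref-style flag (rather than only in `useState`) can also avoid a rapid double-tap starting two recordings before the re-render lands.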

Player.js

```js
import React, { useState, useRef } from 'react';
import { View, TouchableOpacity, Text, StyleSheet } from 'react-native';
import Video from 'react-native-video';

const VideoPlayer = ({ videoPath, onBackToCamera }) => {
  const [paused, setPaused] = useState(false);
  const videoRef = useRef(null);

  // Format the video source properly for iOS local files
  const getVideoSource = () => {
    if (!videoPath) {
      console.log('No video path provided');
      return null;
    }

    if (videoPath.startsWith('file://')) {
      // For iOS, sometimes the file:// URI works as-is
      return { uri: videoPath };
    } else if (videoPath.startsWith('/')) {
      // If it's just a path, add the file:// prefix
      return { uri: `file://${videoPath}` };
    }
    return { uri: videoPath };
  };

  const handleBackToCamera = () => {
    onBackToCamera();
  };

  const togglePlayPause = () => {
    setPaused(!paused);
  };

  const onBuffer = (buffer) => {
    console.log('Buffering:', buffer);
  };

  const onError = (error) => {
    console.log('Video Error:', error);
  };

  const videoSource = getVideoSource();

  if (!videoSource) {
    return (
      <View style={styles.container}>
        <View style={styles.controls}>
          <TouchableOpacity style={styles.button} onPress={handleBackToCamera}>
            <Text style={styles.buttonText}>Back</Text>
          </TouchableOpacity>
        </View>
        <Text style={styles.errorText}>No video to display</Text>
      </View>
    );
  }

  return (
    <View style={styles.container}>
      <Video
        disableAudioSessionManagement={true}
        source={videoSource}
        ref={videoRef}
        style={styles.video}
        paused={paused}
        resizeMode="contain"
        repeat={false}
        controls={false}
        playInBackground={false}
        playWhenInactive={false}
        onBuffer={onBuffer}
        onError={onError}
        onLoad={(data) => console.log('Video loaded:', data)}
        onLoadStart={() => console.log('Video load started')}
        onReadyForDisplay={() => console.log('Video ready for display')}
      />

      <View style={styles.controls}>
        <TouchableOpacity style={styles.button} onPress={handleBackToCamera}>
          <Text style={styles.buttonText}>Back</Text>
        </TouchableOpacity>

        <TouchableOpacity style={styles.button} onPress={togglePlayPause}>
          <Text style={styles.buttonText}>{paused ? 'Play' : 'Pause'}</Text>
        </TouchableOpacity>
      </View>
    </View>
  );
};

const styles = StyleSheet.create({
  container: { flex: 1, backgroundColor: 'black' },
  errorText: { color: 'white', fontSize: 16, textAlign: 'center', marginTop: 50 },
  video: { flex: 1 },
  controls: {
    position: 'absolute',
    bottom: 50,
    left: 0,
    right: 0,
    flexDirection: 'row',
    justifyContent: 'space-around',
    paddingHorizontal: 20,
  },
  button: { backgroundColor: 'rgba(0, 0, 0, 0.7)', paddingHorizontal: 20, paddingVertical: 10, borderRadius: 5 },
  buttonText: { color: 'white', fontSize: 16, fontWeight: 'bold' },
});

export default VideoPlayer;
```
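The branching inside `getVideoSource` is a pure transformation, so it can be factored out and unit-tested in isolation. A sketch (`toFileUri` is a hypothetical name, not an API of react-native-video):

```javascript
// Normalize a recorded video path into a URI react-native-video can load.
// Mirrors the branches of getVideoSource above.
function toFileUri(videoPath) {
  if (!videoPath) return null;                           // nothing recorded yet
  if (videoPath.startsWith('file://')) return videoPath; // already a file URI
  if (videoPath.startsWith('/')) return `file://${videoPath}`; // bare iOS path
  return videoPath;                                      // remote URL, asset name, etc.
}

console.log(toFileUri('/var/mobile/tmp/rec.mov')); // "file:///var/mobile/tmp/rec.mov"
console.log(toFileUri('file:///tmp/rec.mov'));     // "file:///tmp/rec.mov" (unchanged)
console.log(toFileUri(null));                      // null
```

Keeping this as a standalone function makes it easy to verify the `file://` handling without mounting the player component.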

Recording and playback work correctly the first time. However, when I attempt to record a second time, recording fails and throws this camera error:

Camera error: [unknown/unknown: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-10868), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x280cdb2a0 {Error Domain=NSOSStatusErrorDomain Code=-10868 "(null)"}}]
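The nested `NSOSStatusErrorDomain` codes in these errors are often four-character codes packed into a 32-bit integer, and decoding them can hint at which subsystem failed. A small plain-JS sketch (`fourCharCode` is a hypothetical helper name):

```javascript
// Decode a 32-bit OSStatus into its four-character code, if printable.
function fourCharCode(status) {
  const bytes = [24, 16, 8, 0].map((shift) => (status >>> shift) & 0xff);
  if (bytes.every((b) => b >= 0x20 && b <= 0x7e)) {
    return String.fromCharCode(...bytes);
  }
  return null; // not a printable FourCC (e.g. plain negative statuses like -10868)
}

console.log(fourCharCode(561145187)); // "!rec" - the AVAudioSession code from the original report
console.log(fourCharCode(-10868));    // null  - this error is a plain CoreAudio/AudioToolbox status
```

The `561145187` from the issue title decodes to `'!rec'`, which points at the audio session refusing to start recording, consistent with the audio-session conflict with video playback discussed below.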

I would greatly appreciate your assistance. Thank you very much!

cloud-github avatar Jul 25 '25 10:07 cloud-github

I've been trying to resolve this issue for a while and unfortunately haven't found a satisfactory solution, but let me leave some notes in case they are helpful for someone:

  • This issue can be reproduced with these steps: record a video, play back the video, then record again.
  • disableAudioSessionManagement didn't help, even across different versions of react-native-video.
  • I was surprised I didn't notice this until now, but my implementation doesn't unmount the video component when it's hidden, because the component is placed at the root of the Stack navigator of React Navigation. That keeps the video (and its audio session) alive in the background, which seems closely related to this issue. Check whether your implementation has the same behavior.
  • Migrating to expo-video mostly solved this in my case. I saw the issue once even with expo-video, but couldn't reproduce it after that.

Sidebook avatar Jul 26 '25 08:07 Sidebook

it ain't much, but it's honest work

maintenance-hans[bot] avatar Aug 14 '25 07:08 maintenance-hans[bot]

After some debugging I also ran into this issue, and then saw that react-native-video had shipped an update to its audio session management. Applying the patch below to react-native-video fixed it for me: Vision Camera no longer throws the error above.

https://github.com/TheWidlarzGroup/react-native-video/commit/d2c92a1f3f6579aa6607389de8e51e4b75012f3b

mikebouwmans avatar Aug 27 '25 15:08 mikebouwmans

@mikebouwmans When I record a video, this error appears. What does react-native-video have to do with it?

Abdullo-0901 avatar Oct 17 '25 13:10 Abdullo-0901