
πŸ› Frame size in frame processor is different from that in the recorded video file in Android

Open meleffendi opened this issue 9 months ago • 2 comments

What's happening?

I'm using a frame processor while recording a video file, and I noticed that the frame size and aspect ratio reported in the frame processor differ from those of the recording. I would expect the frame processor and the recorded video to have the same resolution, or at least the same aspect ratio, because according to the documentation:

videoWidth/videoHeight: The resolution that will be used for recording videos and streaming into frame processors. This also affects the preview's aspect ratio. Choose a format with your desired resolution.

I know that the camera will try to match the format I request, and if it can't, it will use the closest possible format. But shouldn't it then use the same frame size in both the frame processor and the recording? Or at least the same aspect ratio?

Here's what I tried in the example app:

const format = useCameraFormat(device, [
    { fps: 30 },
    { videoResolution: { width: 640, height: 480 } },
  ])

This gave 640x480 (4:3) in the frame processor, but the recording was 720x480 (3:2).

const format = useCameraFormat(device, [
    { fps: 30 },
    { videoResolution: { width: 640, height: 480 } },
    { videoAspectRatio: 640 / 480 },
  ])

(same result as above)

This presents an issue in my application because I collect certain image coordinates in the frame processor for later processing. When I load the video and try to render the points on it, they end up away from where they should be. I could simply scale the coordinates, yes, but scaling only works if the frame processor and the video file share the same aspect ratio, i.e. both 4:3 or both 3:2. Otherwise, the frame processor is seeing a different frame from what is being recorded.
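To illustrate why the aspect ratio matters here, a minimal sketch (hypothetical helper names, not part of VisionCamera) of scaling a point between the two resolutions: when the ratios match, both axes scale by the same factor; when they differ, the axes scale differently and the points drift.

```typescript
// Hypothetical helper: maps a point from frame-processor coordinates
// to video-file coordinates by scaling each axis independently.
type Size = { width: number; height: number };
type Point = { x: number; y: number };

function scalePoint(p: Point, from: Size, to: Size): Point {
  return {
    x: (p.x * to.width) / from.width,
    y: (p.y * to.height) / from.height,
  };
}

// Matching aspect ratios (both 4:3): x and y scale by the same factor (2.0),
// so the geometry is preserved.
scalePoint({ x: 320, y: 240 }, { width: 640, height: 480 }, { width: 1280, height: 960 });
// → { x: 640, y: 480 }

// Mismatched ratios (4:3 frame vs 3:2 video, as in case 1 above):
// x scales by 1.125 but y by 1.0, so shapes are distorted.
scalePoint({ x: 320, y: 240 }, { width: 640, height: 480 }, { width: 720, height: 480 });
// → { x: 360, y: 240 }
```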

I tried not passing any format to the camera in the example app: same result as above.

I tried not passing any format to the camera in my app: this gave 640x480 in the frame processor and 1920x1080 in the recording, which is actually great because my frame processor runs faster and I can still record at good quality. The problem is that I cannot guarantee this will be the behavior on all devices.

I also tried this in the example app:

const format = useCameraFormat(device, [
    { fps: 30 },
    { videoResolution: { width: 1920, height: 1080 } },
    { videoAspectRatio: 1920 / 1080 },
  ])

This gave 1920x1080 in the frame processor and 1280x720 in the recording.

Using this format in my app, the frame processor and the recording were the same size.

So in short, if the camera was not able to match the requested video resolution or aspect ratio, I have no way of knowing until the video is done recording.
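As a partial runtime guard (a sketch with a hypothetical helper, not a VisionCamera API), you can at least compare the dimensions the frame processor actually reports (`frame.width`/`frame.height`) against the selected format's `videoWidth`/`videoHeight` and warn on a ratio mismatch. Note the caveat from this report: the recorded file may still differ from the selected format, so this only catches part of the problem.

```typescript
// Hypothetical guard: returns true when two resolutions have the same
// aspect ratio within a small tolerance.
type Size = { width: number; height: number };

function aspectRatiosMatch(a: Size, b: Size, tolerance = 0.01): boolean {
  return Math.abs(a.width / a.height - b.width / b.height) < tolerance;
}

// Usage sketch inside a frame processor (pseudocode, assumes `frame`
// and the selected `format` are in scope):
//   if (!aspectRatiosMatch(
//         { width: frame.width, height: frame.height },
//         { width: format.videoWidth, height: format.videoHeight })) {
//     console.warn('Frame processor and format aspect ratios differ')
//   }
```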

Reproducible Code

//case 1:
const format = useCameraFormat(device, [
    { fps: 30 },
    { videoResolution: { width: 640, height: 480 } },
  ])


//case 2:
const format = useCameraFormat(device, [
    { fps: 30 },
    { videoResolution: { width: 640, height: 480 } },
    { videoAspectRatio: 640 / 480 },
  ])

//case 3:
const format = useCameraFormat(device, [
    { fps: 30 },
    { videoResolution: { width: 1920, height: 1080 } },
    { videoAspectRatio: 1920 / 1080 },
  ])

Relevant log output

case 1:
355381686493737: 640x480 yuv Frame (landscape-right)
media has loaded.
Video loaded. Size: 480x720 (portrait, 1.165 seconds)

case 2:
355441692040737: 640x480 yuv Frame (landscape-right)
 LOG  media has loaded.
 LOG  Video loaded. Size: 480x720 (portrait, 0.899 seconds)

case 3:
355489995507737: 1920x1080 yuv Frame (landscape-right)
 LOG  media has loaded.
 LOG  Video loaded. Size: 720x1280 (portrait, 0.967 seconds)

Camera Device

{
  "formats": [],
  "sensorOrientation": "landscape-left",
  "hardwareLevel": "full",
  "maxZoom": 8,
  "minZoom": 1,
  "maxExposure": 12,
  "supportsLowLightBoost": false,
  "neutralZoom": 1,
  "physicalDevices": [
    "wide-angle-camera"
  ],
  "supportsFocus": true,
  "supportsRawCapture": false,
  "isMultiCam": false,
  "minFocusDistance": 10,
  "minExposure": -12,
  "name": "0 (BACK) androidx.camera.camera2",
  "hasFlash": true,
  "hasTorch": true,
  "position": "back",
  "id": "0"
}

Device

OnePlus5

VisionCamera Version

4.5.2

Can you reproduce this issue in the VisionCamera Example app?

Yes, I can reproduce the same issue in the Example app here

Additional information

meleffendi avatar Feb 21 '25 04:02 meleffendi

Guten Tag, Hans here 🍻

Thanks for your detailed report! It seems like you're encountering a discrepancy between ze frame size in ze frame processor and ze recorded video resolution. This can definitely be a bit tricky.

From your description and code, it appears you've worked hard to identify ze issue and provided good logging information. I recommend checking if ze device supports ze resolution you are requesting consistently. Some devices might behave differently based on their camera hardware capabilities.

If you can, please check if this happens on other devices as well. If it’s just on ze OnePlus5, it might be a device-specific quirk.

Feel free to update us with any new findings. In ze meantime, if you find this project helpful, consider sponsoring mrousavy to help him maintain it! Become a sponsor 😊

Note: If you think I made a mistake, please ping @mrousavy to take a look.

maintenance-hans[bot] avatar Feb 21 '25 04:02 maintenance-hans[bot]

@mrousavy I've recently discovered that a similar phenomenon exists on the Samsung S23 FE. The camera format was filtered to be 2160x3840 at 30fps, and that format was confirmed in the logs, but the resolution of the resulting video was only 720x1280. Disabling the frame processor by commenting out the frameProcessor camera prop restored the video resolution to 2160x3840. I have not had the same issue with iOS, where the resolutions for the frame processor and the video are consistent at 2160x3840.

titanium-cranium avatar Jun 11 '25 08:06 titanium-cranium

The problem with the lower video resolution can be fixed by updating camerax_version in build.gradle from 1.5.0-alpha03 to 1.5.1 (https://developer.android.com/jetpack/androidx/releases/camera#1.5.1), whose release notes state: "Fixed an issue that prevented Preview from selecting 16:9 resolutions and VideoCapture from recording at QUALITY_1080P."

ievgenmukhin avatar Oct 21 '25 15:10 ievgenmukhin

We’re running into a strange issue when using frame processing. When recording a video, the result normally follows our specified format:

const format = useCameraFormat(device, [
  { photoAspectRatio: 16 / 9 },
  { videoAspectRatio: 16 / 9 },
  { videoResolution: { width: 1920, height: 1080 } },
  { photoResolution: { width: 1920, height: 1080 } },
  { fps: 30 }, // Reduced to 30fps for better stability and less motion blur
]);

However, when frame processing is enabled, the recorded video ends up being 1080×720, which is not what we want. We need the original 1920×1080 resolution because our model training depends on it.

Schabaani avatar Nov 09 '25 11:11 Schabaani

@Schabaani Try this and see if it works for you. I set the photoResolution as well and it seemed to solve the problem.

  const screenAspectRatio = SCREEN_HEIGHT / SCREEN_WIDTH

  const imageResolution = { width: 3840, height: 2160 }

  // All of these parameters are required to specify a format that returns
  // 3840x2160 video at 30 FPS in the frame processor on Android
  const format = useCameraFormat(device, [
    { fps: fps },
    { videoAspectRatio: screenAspectRatio },
    { videoResolution: imageResolution },
    { photoAspectRatio: screenAspectRatio },
    { photoResolution: 'max' },
  ])

titanium-cranium avatar Nov 09 '25 23:11 titanium-cranium