🐛 Frame toArrayBuffer output is the wrong size. Missing depth
What's happening?
Reading the pixel data for my camera frame produces a Uint8Array with only a single byte per pixel.
width=640, height=480, pixelFormat=rgb, orientation=landscape-right, bytesPerRow=2560
```ts
const buffer = frame.toArrayBuffer();
const data = new Uint8Array(buffer);
```
data.length is 307200, which is exactly 640 * 480, indicating only 1 byte per pixel (grayscale?). If the pixelFormat were actually RGBA, it should be 640 * 480 * 4 = 1228800.
The bytesPerRow seems to be correct (bytesPerRow (2560) === width * 4), but it doesn't match the array length. Am I doing something wrong? I'd like to grab pixels from the edges of the frame, but the indices keep going out of bounds.
Reproducible Code
```ts
const device = useCameraDevice(CameraType.Back);

const frameProcessor = useFrameProcessor(frame => {
  "worklet";
  try {
    runAtTargetFps(4, () => {
      const { width, height, orientation, bytesPerRow, pixelFormat } = frame;
      const buffer = frame.toArrayBuffer();
      const data = new Uint8Array(buffer);
      const bytesPerPixel = 4; // 3 also seems wrong

      const getPixel = (data, x, y, bytesPerRow, bytesPerPixel) => {
        const index = y * bytesPerRow + x * bytesPerPixel;
        return {
          r: data[index],
          g: data[index + 1],
          b: data[index + 2],
          a: data[index + 3],
        };
      };

      // Top-Left Corner
      const topLeftPixel = getPixel(data, 0, 0, bytesPerRow, bytesPerPixel);
      // Top-Right Corner
      const topRightPixel = getPixel(data, width - 1, 0, bytesPerRow, bytesPerPixel);
      // Bottom-Left Corner
      const bottomLeftPixel = getPixel(data, 0, height - 1, bytesPerRow, bytesPerPixel);
      // Bottom-Right Corner
      const bottomRightPixel = getPixel(data, width - 1, height - 1, bytesPerRow, bytesPerPixel);
    });
  } catch (err) {
    console.log("frameProcessor err", err, (err as Error).message);
  }
}, []);
```
```tsx
<Camera
  style={StyleSheet.absoluteFill}
  photo
  video
  audio={false}
  device={device}
  pixelFormat="rgb"
  isActive={isFocused && isForeground}
  ref={cameraRef}
  onInitialized={handleCameraInit}
  onError={handleCameraError}
  onStarted={() => "Camera started!"}
  onStopped={() => "Camera stopped!"}
  outputOrientation="preview"
  androidPreviewViewType="surface-view"
  zoom={settings.zoom}
  exposure={Number(settings.exposure)}
  photoQualityBalance="balanced"
  frameProcessor={frameProcessor}
/>
```
Relevant log output
LOG Frame dimensions: width=640, height=480, bytesPerRow=2560
LOG Top-Left Pixel: 14, 12, 15, A=255
LOG Top-Right Pixel: R=37, G=32, B=29, A=255
LOG Index out of bounds: 1226240
LOG Bottom-Left Pixel: R=undefined, G=undefined, B=undefined, A=undefined
LOG Index out of bounds: 1228796
LOG Bottom-Right Pixel: R=undefined, G=undefined, B=undefined, A=undefined
Camera Device
{
"formats": [],
"hardwareLevel": "full",
"hasFlash": true,
"hasTorch": true,
"id": "0",
"isMultiCam": false,
"maxExposure": 20,
"maxZoom": 8,
"minExposure": -20,
"minFocusDistance": 0,
"minZoom": 1,
"name": "0 (BACK) androidx.camera.camera2",
"neutralZoom": 1,
"physicalDevices": [
"wide-angle-camera"
],
"position": "back",
"sensorOrientation": "landscape-left",
"supportsFocus": true,
"supportsLowLightBoost": false,
"supportsRawCapture": false
}
Device
Galaxy S9
VisionCamera Version
4.3.2 and 4.4.1
Can you reproduce this issue in the VisionCamera Example app?
I didn't try (⚠️ your issue might get ignored & closed if you don't try this)
Additional information
- [X] I am using Expo
- [X] I have enabled Frame Processors (react-native-worklets-core)
- [X] I have read the Troubleshooting Guide
- [X] I agree to follow this project's Code of Conduct
- [X] I searched for similar issues in this repository and found none.
Guten Tag, Hans here.
[!NOTE] New features, bugfixes, updates and other improvements are all handled mostly by @mrousavy in his free time. To support @mrousavy, please consider 💖 sponsoring him on GitHub 💖. Sponsored issues will be prioritized.
@mrousavy just an update. I got my hands on a newer Android device (Galaxy S21) and rebuilt for that device, however the same issue is occurring as on the S9:
```ts
const frameProcessor = useFrameProcessor(frame => {
  "worklet";
  try {
    runAtTargetFps(3, () => {
      const { width, height, orientation, bytesPerRow, pixelFormat } = frame;
      const buffer = frame.toArrayBuffer();
      const data = new Uint8Array(buffer);
      const bytesPerPixel = 4;

      console.log(`Frame: ${frame.width} ${frame.height}`); // 1280 720
      console.log(`Data length: ${data.length}`); // 921600
      console.log(`Expected length: ${width * height * bytesPerPixel}`); // 3686400

      if (data.length !== width * height * bytesPerPixel) {
        console.error(
          "Data length does not match expected size. Check the frame format and conversion method."
        );
      }
    });
  } catch (err) {
    console.log("frameProcessor err", err, (err as Error).message);
  }
}, []);
```
The data.length is always frame.width * frame.height, which doesn't make sense. Is the frame compressed in some way? Am I doing something wrong?
Thanks for your help.
> Is the frame compressed in some way?

No, but it might have multiple planes, and toArrayBuffer() might only return the first plane (Y or R).
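If that's what is happening, the bytes that do come back can still be read as a single channel while this is open. A minimal sketch of that kind of workaround; `readFirstPlane` is a hypothetical helper, and the 1-byte-per-pixel layout with a row stride equal to `width` is an assumption here, not something the API documents:

```ts
import type { Frame } from 'react-native-vision-camera';

// Assumption: toArrayBuffer() currently returns only the first plane,
// tightly packed at 1 byte per pixel with a row stride equal to frame.width.
function readFirstPlane(frame: Frame): ((x: number, y: number) => number) | null {
  'worklet';
  const data = new Uint8Array(frame.toArrayBuffer());
  // Bail out if even the single-plane assumption doesn't hold.
  if (data.length < frame.width * frame.height) return null;
  return (x, y) => data[y * frame.width + x];
}
```

Inside a frame processor, `readFirstPlane(frame)?.(frame.width - 1, frame.height - 1)` then stays in bounds for the bottom-right corner, unlike the 4-byte-per-pixel indexing above.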
same issue
Same issue on Android also.
Yeah, same problem. Some plane is missing for sure, because the image has wrong coloring.
Yeah, I'm having the same issue. I was trying with the same frame size as yours (640x480) and was expecting the Uint8Array to be 640 * 480 * 4 = 1,228,800 bytes, but it turned out to be 307,200. I'm actually doing some matrix operations on the buffer, and when I checked whether my operations were correct, I saw that they were, but the buffer is incomplete.
Update from my side: I managed to fix this for my specific case, in which my pixel format is RGBA, by patching the C++ toArrayBuffer function.
I changed:
```cpp
size_t size = bufferDescription.height * bufferDescription.stride;
```
to
```cpp
size_t size = bufferDescription.height * bufferDescription.stride * 4;
```
I think a proper fix would be to parse the frame's pixelFormat and adjust the buffer size accordingly. Leaving this info here in case anyone wants to work on it; I might do it later if I find the time.
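Until a fix like that lands in the native code, a JS-side guard can at least detect the truncated buffer before any indexing goes out of bounds. A minimal sketch, assuming only the "rgb" (4 bytes per pixel) and "yuv" (1 byte per pixel for the Y plane alone) cases matter; `expectedBytesPerPixel` and `getValidatedPixels` are hypothetical helpers, not part of the VisionCamera API:

```ts
import type { Frame } from 'react-native-vision-camera';

// Hypothetical helper: rough bytes-per-pixel guess from the frame's pixel format.
// "rgb" frames are 32-bit RGBA/BGRA; for "yuv" we only count the Y plane here.
function expectedBytesPerPixel(pixelFormat: Frame['pixelFormat']): number {
  'worklet';
  return pixelFormat === 'rgb' ? 4 : 1;
}

function getValidatedPixels(frame: Frame): Uint8Array | null {
  'worklet';
  const data = new Uint8Array(frame.toArrayBuffer());
  const expected = frame.width * frame.height * expectedBytesPerPixel(frame.pixelFormat);
  if (data.length < expected) {
    // Smaller than the pixel format implies -> likely only the first plane was copied.
    console.log(`toArrayBuffer() returned ${data.length} bytes, expected ${expected}`);
    return null;
  }
  return data;
}
```

The `<` comparison (rather than `!==`) leaves room for row padding, since bytesPerRow can legitimately be larger than width * bytesPerPixel.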
interesting find