GPUImage
Video corrupted with 1920x1080 resolution and rawDataOutput
Hi all,
when I use rawDataOutput with any resolution below 1920x1080, the live image on the screen looks fine. However, when I select AVCaptureSessionPreset1920x1080, the screen shows a garbled image. In this test I did not apply any filter. Is this a bug? (Tested on an iPhone 5S.) Thanks for the help!
Here is the relevant code:
// Initialization
CGSize videoSize = CGSizeMake(1080.0, 1920.0);
GPUImageRawDataOutput *rawDataOutput = [[GPUImageRawDataOutput alloc] initWithImageSize:videoSize resultsInBGRAFormat:YES];
GPUImageRawDataInput *rawDataInput = [[GPUImageRawDataInput alloc] initWithBytes:[rawDataOutput rawBytesForImage] size:videoSize];
[stillCamera addTarget:rawDataOutput]; // TEST
// Callback on raw data output
__weak GPUImageRawDataOutput *weakOutput = rawDataOutput;
[rawDataOutput setNewFrameAvailableBlock:^{
GLubyte *outputBytes = [weakOutput rawBytesForImage];
// Update rawDataInput
[rawDataInput updateDataFromBytes:outputBytes size:videoSize];
[rawDataInput processData];
}];
// Send unprocessed image to image view for display
[rawDataInput addTarget:filterView];
[stillCamera startCameraCapture];
This might be related to issue #1517, where image sizes that weren't multiples of 16 weren't handled correctly due to some kind of padding (I haven't quite tracked down the source of that).
Sorry, my comment here was incorrect. What I wrote in the comment above is in fact the correct description of the buggy behavior.
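If padding is the culprit, one way to probe it from user code is to compare the tightly packed row size against the stride the output actually uses. Below is a minimal sketch, reusing the rawDataOutput / rawDataInput / videoSize names from the code above, and assuming -bytesPerRowInOutput reports the real (possibly padded) stride of the output framebuffer:
// Sketch only: work around possible row padding by copying each row into a
// tightly packed buffer before handing it to the raw data input.
// Assumes 4 bytes per pixel (BGRA) and that bytesPerRowInOutput reflects
// any padding the framebuffer applies.
NSUInteger tightBytesPerRow = (NSUInteger)videoSize.width * 4;
NSUInteger height = (NSUInteger)videoSize.height;
// Captured strongly by the block below, so it stays alive between frames.
NSMutableData *tightData = [NSMutableData dataWithLength:tightBytesPerRow * height];
__weak GPUImageRawDataOutput *weakOutput = rawDataOutput;
[rawDataOutput setNewFrameAvailableBlock:^{
    [weakOutput lockFramebufferForReading];
    GLubyte *outputBytes = [weakOutput rawBytesForImage];
    NSUInteger paddedBytesPerRow = [weakOutput bytesPerRowInOutput];
    GLubyte *tightBytes = (GLubyte *)tightData.mutableBytes;
    for (NSUInteger row = 0; row < height; row++)
    {
        // Copy only the meaningful width * 4 bytes of each row, skipping padding.
        memcpy(tightBytes + row * tightBytesPerRow,
               outputBytes + row * paddedBytesPerRow,
               tightBytesPerRow);
    }
    [weakOutput unlockFramebufferAfterReading];
    [rawDataInput updateDataFromBytes:tightBytes size:videoSize];
    [rawDataInput processData];
}];
If paddedBytesPerRow matches tightBytesPerRow at the resolutions that look fine but differs at the ones that don't, that would support the padding theory.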
I have confirmed that the same happens for a resolution of 480x360.
It looks like the dimensions in your code are backwards; it should be width x height.
CGSize videoSize = CGSizeMake(1080.0, 1920.0);
should be:
CGSize videoSize = CGSizeMake(1920.0, 1080.0);
Neither 1080 nor 360 is divisible by 16, so that could be it.
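As a quick way to spot the suspect sizes up front, something like this could help; it is just a debugging aid based on the 16-alignment theory above, not a confirmed fix:
// Debugging aid (an assumption about the cause, not a confirmed root cause):
// warn when either dimension is not a multiple of 16, since those sizes
// appear to trigger the corruption discussed above.
if (((NSUInteger)videoSize.width % 16) != 0 || ((NSUInteger)videoSize.height % 16) != 0)
{
    NSLog(@"Warning: %.0fx%.0f is not 16-aligned; raw data output rows may be padded.",
          videoSize.width, videoSize.height);
}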
I can confirm that there is something odd going on with this flow, possibly with GPUImageRawDataOutput itself.
Using the code he has above, with this flow: GPUImageMovie -> GPUImageRawDataOutput -> GPUImageRawDataInput -> GPUImageFilter (sketched below), I get an image that looks almost nothing like my original displayed video. (The recording is taken from the GPUImageRawDataInput with GPUImageMovieWriter, before sending from the rawDataInput to the filter, using the JPEG encoder.)
If I instead skip both raw data stages by using GPUImageMovie -> GPUImageFilter directly, I get an image very close to what was displayed while I was recording, although a bit grainier (probably due to the JPEG encoder).
This is with an 80x80 video size, so an unusually small case comparatively speaking, but definitely divisible by 16.
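For reference, a sketch of the movie-based flow described above; the file path and the sepia filter are placeholders, not from this thread:
// Sketch of GPUImageMovie -> GPUImageRawDataOutput -> GPUImageRawDataInput -> filter.
NSURL *movieURL = [NSURL fileURLWithPath:@"/path/to/movie.m4v"]; // placeholder path
GPUImageMovie *movie = [[GPUImageMovie alloc] initWithURL:movieURL];
CGSize videoSize = CGSizeMake(80.0, 80.0);

GPUImageRawDataOutput *rawOutput =
    [[GPUImageRawDataOutput alloc] initWithImageSize:videoSize resultsInBGRAFormat:YES];
GPUImageRawDataInput *rawInput =
    [[GPUImageRawDataInput alloc] initWithBytes:[rawOutput rawBytesForImage] size:videoSize];
GPUImageSepiaFilter *sepia = [[GPUImageSepiaFilter alloc] init]; // any filter works here

[movie addTarget:rawOutput];

// Forward each decoded frame through the raw data bridge into the filter chain.
__weak GPUImageRawDataOutput *weakRawOutput = rawOutput;
[rawOutput setNewFrameAvailableBlock:^{
    [rawInput updateDataFromBytes:[weakRawOutput rawBytesForImage] size:videoSize];
    [rawInput processData];
}];

[rawInput addTarget:sepia];
[movie startProcessing];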
I may have spoken too soon. Looking into the issue further suggests it's a pixel format issue: I'm initializing my RawDataInput as GL_RGB, which I'm thinking may be incompatible with RawDataOutput?
Yes, and I'm sorry that I've taken up some space here. My issue was that GPUImageRawDataOutput outputs in RGBA, while my GPUImageRawDataInput was marked as RGB. As soon as I converted it to RGBA, all was right with the world.
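The PR linked below addresses this on the GPUImage2 (Swift) side. In the Objective-C GPUImage API, the same mismatch can be avoided with the pixel-format initializer; a sketch, assuming the videoSize and rawDataOutput from the code earlier in this thread:
// Make the input's pixel format match what the raw output produces.
// With resultsInBGRAFormat:YES on the output, the input should be BGRA too;
// use GPUPixelFormatRGBA instead if the output was created with resultsInBGRAFormat:NO.
GPUImageRawDataInput *rawDataInput =
    [[GPUImageRawDataInput alloc] initWithBytes:[rawDataOutput rawBytesForImage]
                                           size:videoSize
                                    pixelFormat:GPUPixelFormatBGRA];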
Solution here https://github.com/BradLarson/GPUImage2/pull/183