VideoCore-Inactive
How to stream video from a different source (GPUImage) - GLESVideoMixer?
Is there a way to have a different video input other than the iPhone camera? I'm using the GPUImage framework, and I would like to feed the stream with the GPUImage output.
I wonder if there is any sample, or whether anyone has ever replaced the default iPhone camera video source with something else. I see there is a `GLESVideoMixer` class, but how do you use it?
+1
@jgh- how fast can `- (void) addPixelBufferSource:(UIImage*)image withRect:(CGRect)rect;` be called? I'm running a callback block from the GPUImage framework that outputs raw data very fast, and I take that data, build a CGImageRef, and redirect it to VideoCore (I modified the addPixelBufferSource method to receive a CGImageRef instead of a UIImage), but it crashes after 3 or 4 seconds of displaying content. I get "Null Texture!" printed in the console many times before crashing, so at least it shows something. Maybe the method that builds the CGImageRef can't keep up with executions that fast, on top of the addPixelBufferSource method processing that image ref.
I can also provide a GLubyte buffer, but I don't know if it needs special formatting for `m_pixelBufferSource->pushPixelBuffer(rawData, width * height * 4);` to work. Any ideas how I can overcome this situation?
If you're using the raw pixelbuffer source you should be using a CVPixelBufferRef for it. It expects that the pointer you pass into pushPixelBuffer be a CVPixelBufferRef. It should be able to handle high frame rates.
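If your source hands you raw BGRA bytes rather than a CVPixelBufferRef, one way to wrap them without copying is CVPixelBufferCreateWithBytes. A minimal sketch, where rawBytes, bytesPerRow, width and height stand in for whatever your frame source provides:

```objc
// Minimal sketch: wrap raw BGRA bytes in a CVPixelBufferRef without copying.
// rawBytes, bytesPerRow, width and height come from your own frame source.
CVPixelBufferRef pixelBuffer = NULL;
CVReturn status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                               width, height,
                                               kCVPixelFormatType_32BGRA,
                                               rawBytes, bytesPerRow,
                                               NULL, NULL,  // no release callback / refcon
                                               NULL,        // no buffer attributes
                                               &pixelBuffer);
if (status == kCVReturnSuccess) {
    // ... hand pixelBuffer to the pixel buffer source here ...
    CVPixelBufferRelease(pixelBuffer);
}
```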
+1
I couldn't simply pass a CVPixelBufferRef; pushPixelBuffer only accepts unsigned char, so I had to do the following...
```objc
-(void)addPixelBufferSource:(CVPixelBufferRef)bufferRef realImageSize:(CGSize)size andRect:(CGRect)rect {
    NSInteger width = size.width;
    NSInteger height = size.height;

    CVPixelBufferLockBaseAddress(bufferRef, 0);
    // Base address of the locked buffer, cast to GLubyte* so it can be indexed per byte.
    GLubyte *rawImageBytes = (GLubyte *)CVPixelBufferGetBaseAddress(bufferRef);
    size_t dataSize = CVPixelBufferGetDataSize(bufferRef);
    size_t bufferHeight = CVPixelBufferGetHeight(bufferRef);
    size_t bufferWidth = CVPixelBufferGetWidth(bufferRef);

    // Copy the BGRA bytes out so they can still be used after the buffer is unlocked.
    unsigned char *buffer = (unsigned char *)malloc(dataSize);
    for (int y = 0; y < bufferHeight; y++) {
        for (int x = 0; x < bufferWidth; x++) {
            buffer[y * bufferWidth * 4 + 4 * x + 0] = rawImageBytes[y * bufferWidth * 4 + 4 * x + 0];
            buffer[y * bufferWidth * 4 + 4 * x + 1] = rawImageBytes[y * bufferWidth * 4 + 4 * x + 1];
            buffer[y * bufferWidth * 4 + 4 * x + 2] = rawImageBytes[y * bufferWidth * 4 + 4 * x + 2];
            buffer[y * bufferWidth * 4 + 4 * x + 3] = rawImageBytes[y * bufferWidth * 4 + 4 * x + 3];
        }
    }
    CVPixelBufferUnlockBaseAddress(bufferRef, 0);

    if (m_pixelBufferSource) {
        m_pixelBufferSource->pushPixelBuffer(buffer, width * height * 4);
    } else {
        // First frame: build the source -> aspect -> position -> mixer chain.
        m_pixelBufferSource = std::make_shared<videocore::Apple::PixelBufferSource>(width, height, kCVPixelFormatType_32BGRA);
        m_pbAspect = std::make_shared<videocore::AspectTransform>(rect.size.width, rect.size.height, videocore::AspectTransform::kAspectFit);
        m_pbPosition = std::make_shared<videocore::PositionTransform>(rect.origin.x, rect.origin.y,
                                                                      rect.size.width, rect.size.height,
                                                                      self.videoSize.width, self.videoSize.height);
        m_pixelBufferSource->setOutput(m_pbAspect);
        m_pbAspect->setOutput(m_pbPosition);
        m_pbPosition->setOutput(m_videoMixer);
        m_videoMixer->registerSource(m_pixelBufferSource);
        m_pixelBufferSource->pushPixelBuffer(buffer, width * height * 4);
    }
    free(buffer);
}
```
I don't know if it's efficient or if it's the right way to do it, but it works... for now.
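One caveat with the copy loop above: it assumes rows are tightly packed. CVPixelBufferGetBytesPerRow can report more than bufferWidth * 4 because of row padding, so a row-by-row memcpy that respects the actual stride should be safer and faster (untested sketch):

```objc
// Untested sketch: copy row by row, honoring the pixel buffer's actual stride.
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(bufferRef);
for (size_t y = 0; y < bufferHeight; y++) {
    memcpy(buffer + y * bufferWidth * 4,     // tightly packed destination row
           rawImageBytes + y * bytesPerRow,  // possibly padded source row
           bufferWidth * 4);                 // 4 bytes per BGRA pixel
}
```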
@omarojo @jgh- If I already have the video buffer in H.264-encoded format, can I call the pushPixelBuffer() method of m_pixelBufferSource to push the buffer every time I intercept video from my custom source? Do I still have to modify the addPixelBufferSource method of VCSimpleSession?
Thanks!
@AkshayBudhiraja You have to modify the addPixelBufferSource method, because that method currently receives a UIImage. That's what I did in my sample: I modified the method so that it accepts a CVPixelBufferRef instead of a UIImage. And yes, I believe you can call m_pixelBufferSource->pushPixelBuffer on every frame.
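A rough sketch of what that per-frame call could look like with the modified method from earlier in the thread (the onFrame: callback is hypothetical; substitute whatever your source invokes per frame):

```objc
// Rough sketch (untested): push each incoming frame through the modified
// addPixelBufferSource: shown earlier. onFrame: is a hypothetical callback.
- (void)onFrame:(CVPixelBufferRef)pixelBuffer {
    CGSize frameSize = CGSizeMake(CVPixelBufferGetWidth(pixelBuffer),
                                  CVPixelBufferGetHeight(pixelBuffer));
    [self.session addPixelBufferSource:pixelBuffer
                         realImageSize:frameSize
                               andRect:CGRectMake(0, 0, frameSize.width, frameSize.height)];
}
```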
@omarojo I was going over this issue and I'm a bit confused about what you added to and removed from your addPixelBufferSource. Could you share your addPixelBufferSource function as a reference?
Thanks
@odemiral I updated my previous post with the complete method. Check it out.
@omarojo Thank You! I'll go over it right now.
@omarojo Thanks for your help! In addition to what @odemiral said, I was wondering how you got rid of the video from the iPhone camera that is currently being used by default? I wasn't able to find where it's actually being called. Also, I am capturing the video data in another view controller using an external SDK, in a didGetVideo()-like function where I get an H.264 video buffer on every frame. Do I have to call m_pixelBufferSource->pushPixelBuffer from within this function, or use some reference I created in addPixelBufferSource?
Much appreciated!
@omarojo Using your modified addPixelBufferSource code, I added the following to the viewDidLoad method of my view controller:
```objc
_session = [[VCSimpleSession alloc] initWithVideoSize:CGSizeMake(1280, 720)
                                            frameRate:30
                                              bitrate:1000000
                              useInterfaceOrientation:NO];

CVPixelBufferRef pxbuffer = nil;
// NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: nil];
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, 512, 512,
                                      kCVPixelFormatType_32BGRA, nil, &pxbuffer);
NSLog(@"Status is: %d", status);

[_session addPixelBufferSource:&pxbuffer withSize:CGSizeMake(512, 512) withRect:CGRectMake(100, 100, 512, 512)];
_session.delegate = self;
```
However, on running it I get the following "unrecognized selector sent to instance" error:
```
2015-11-18 00:05:55.441 SampleBroadcaster[401:182521] -[VCSimpleSession addPixelBufferSource:withSize:withRect:]: unrecognized selector sent to instance 0x134ecae50
2015-11-18 00:05:55.441 SampleBroadcaster[401:182521] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[VCSimpleSession addPixelBufferSource:withSize:withRect:]: unrecognized selector sent to instance 0x134ecae50'
```
I was wondering if you ran into the same problem or if I'm doing something wrong? Thanks!
The name of the method is different from the original in the VideoCore Library. Remember to expose or change the name of the method in the .h file, so that it can be used.
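For reference, the declaration you would expose in VCSimpleSession.h, assuming the modified method shown earlier in this thread:

```objc
// Assumed declaration for VCSimpleSession.h, matching the modified
// implementation posted earlier in the thread.
- (void) addPixelBufferSource:(CVPixelBufferRef)bufferRef
                realImageSize:(CGSize)size
                      andRect:(CGRect)rect;
```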
Very interesting topic; I'm working on the same "VideoCore <-> GPUImage" link at the moment. I removed the camera and mic from the graph and used the code snippet above:
```swift
let imgSize = CGSize(width: 640, height: 480)
let rawOutput = GPUImageRawDataOutput(imageSize: imgSize, resultsInBGRAFormat: true)
var session : VCSimpleSession = VCSimpleSession(videoSize: imgSize, frameRate: 30, bitrate: 1000000, useInterfaceOrientation: false)
session.delegate = self

rawOutput.newFrameAvailableBlock = {
    self.rawOutput.lockFramebufferForReading()
    let outputBytes = self.rawOutput.rawBytesForImage
    let bytesPerRow = self.rawOutput.bytesPerRowInOutput()
    if bytesPerRow == 0 {
        return
    }
    var pxbuffer : CVPixelBufferRef? = nil
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault, Int(imgSize.width), Int(imgSize.height), kCVPixelFormatType_32BGRA, outputBytes, Int(bytesPerRow), nil, nil, nil, &pxbuffer)
    self.session.addPixelBufferSource(pxbuffer, realImageSize: imgSize, andRect: CGRect(origin: CGPoint(x: 0, y: 0), size: imgSize))
    self.rawOutput.unlockFramebufferAfterReading()
}

lowerView.addSubview(viewOutput)
viewOutput.frame = lowerView.bounds

videoCamera = GPUImageVideoCamera(sessionPreset: AVCaptureSessionPreset640x480, cameraPosition: .Back)
videoCamera!.outputImageOrientation = .Portrait
videoCamera?.addTarget(viewOutput)
videoCamera?.addTarget(rawOutput)
videoCamera?.startCameraCapture()

session.startRtmpSessionWithURL("rtmp://xxx/hls", andStreamKey: streamId)
```
Somehow, there's still no throughput... does anybody know what I'm doing wrong?
I also want to know how to get input from GPUImage. Can anyone show me some code?
:+1:
Is what you guys are doing a hack, or is this the correct way to use a custom video source? Does anyone have a sample of the changes they made to get this working?
@omarojo it would be really awesome if you could provide some more information, especially how you did the GPUImage part and then fed VideoCore...
Look at this example: GDLRawDataOutput
Hey, finally got this working. However, the images I'm sending are upside down, and the colors are off (blues look purple, etc.). Any ideas? Thanks