
View with 2 videos is not getting recorded

nihtin9mk opened this issue 10 years ago · 37 comments

I am using ASScreenRecorder in my app. My view controller has 2 views, both playing video - just like a video chat in the Skype application.

When I try to record my screen, only one video view appears in the recording; the other one is missing. Why is this happening?

nihtin9mk avatar Sep 11 '14 19:09 nihtin9mk

Hi @nihtin9mk, there are certain types of view/layer that can't be automatically recorded - AVCaptureVideoPreviewLayer is one such case. To record these views you need to manually write the pixel data into the CGContext using the writeBackgroundFrameInContext:(CGContextRef *)contextRef method.

As you described your use case as a 'video chat' application, I assume this is probably the problem you are encountering. Is one of the video views a live preview?

Al

alskipp avatar Sep 11 '14 20:09 alskipp

@alskipp Hi, yes, exactly - this is the issue in my app. The view is a live preview.

Please give me a solution for this. I would be very thankful if you could provide some sample code.

In my view controller I wrote the code below:

-(void)viewDidLoad
{
    [super viewDidLoad];
    [ASScreenRecorder sharedInstance].delegate = self;
}

// myVideoView is the image view with the video preview

-(void)writeBackgroundFrameInContext:(CGContextRef *)contextRef
{
    CGSize imgSize = [myVideoView.image size];
    UIGraphicsBeginImageContext(imgSize);
    contextRef = UIGraphicsGetCurrentContext();
}

I believe this is not the proper way - please guide me.

nihtin9mk avatar Sep 12 '14 09:09 nihtin9mk

@alskipp Please help me to do this.

nihtin9mk avatar Sep 12 '14 16:09 nihtin9mk

Hi I'll do my best to point you in the right direction either later today or tomorrow. As you can probably appreciate this open source library doesn't pay the bills and I'm not independently wealthy, consequently I'm currently working for 'the man'.

As a quick pointer - you need to get direct access to the pixel data in your live video input. If memory serves correctly you'll need to implement a method from AVCaptureVideoDataOutputSampleBufferDelegate. Probably the easiest thing to do is to create a CGImageRef ivar in your controller that you continuously update in -captureOutput:didOutputSampleBuffer:fromConnection:.

You then implement the delegate method like y

alskipp avatar Sep 12 '14 17:09 alskipp

Whoops. Typing this on phone and accidentally tapped close and comment.

Anyway. You need to implement the delegate method, but then you draw the CGImage (which you created in captureOutput:didOutputSampleBuffer:fromConnection:) into the context.

alskipp avatar Sep 12 '14 17:09 alskipp

Hi @alskipp - thank you for your help with this, and of course all your open source work is really helpful and great for developers like me.

I am not so familiar with AVFoundation and the captureOutput:didOutputSampleBuffer: side of things. I hope you can give a better picture of how to implement the delegate method. Please take your time.

And once again a lot of thanks for your selfless work.

nihtin9mk avatar Sep 12 '14 17:09 nihtin9mk

Hi, I'm on my way back home now. I'll try and post a few code examples this evening or tomorrow. If you get a chance, take a look at the documentation for AVCaptureVideoDataOutputSampleBufferDelegate, as I think your view controller will need to implement this delegate to receive the live video data. You'll then use this pixel data to write into the video context.

Al

alskipp avatar Sep 12 '14 19:09 alskipp

I'll give a code example of how to turn the pixel data from captureOutput:didOutputSampleBuffer:fromConnection: into an image you can use.

alskipp avatar Sep 12 '14 19:09 alskipp

OK. Here we go: First your view controller will need to implement AVCaptureVideoDataOutputSampleBufferDelegate - declared something like: @interface CameraViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>

As you're already previewing the video on screen, I assume there's already an AVCaptureSession set up, which is then used to init an AVCaptureVideoPreviewLayer.

In viewDidLoad you'll want to register to receive a notification for when your captureSession starts - something like:

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(captureBegan)
                                             name:AVCaptureSessionDidStartRunningNotification
                                           object:self.captureSession];

Then you'll need to implement the method named in the selector:

- (void)captureBegan
{
    [ASScreenRecorder sharedInstance].delegate = self;
}

Don't forget to remove the delegate when the captureSession ends: [ASScreenRecorder sharedInstance].delegate = nil;
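A sketch of that teardown, mirroring the start-notification registration above (the captureEnded selector name here is just an example, not part of any API):

```objc
// In viewDidLoad, alongside the start observer:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(captureEnded)
                                             name:AVCaptureSessionDidStopRunningNotification
                                           object:self.captureSession];

// Then clear the delegate when the session stops:
- (void)captureEnded
{
    [ASScreenRecorder sharedInstance].delegate = nil;
}
```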

The next comment will show the basics of AVCaptureVideoDataOutputSampleBufferDelegate; it won't be the full implementation you need just yet. But let's just try and get an image created from the sampleBuffer and then we can continue from there…

alskipp avatar Sep 12 '14 21:09 alskipp

This doesn't do anything too useful yet - but let's see if the CGImageRef is successfully created. If so, most of the hard work has been achieved; there are just a few more pieces to put in place.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CGImageRef image = [self createCGImageFromSampleBuffer:sampleBuffer];
    // check if image creation is successful
    CGImageRelease(image);
}

Here is how to get a CGImage from the sampleBuffer:

- (CGImageRef)createCGImageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer,0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    CGContextRef imageContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    CGImageRef newImage = CGBitmapContextCreateImage(imageContext);

    CGContextRelease(imageContext);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    return newImage;
}

Let me know if all this works so far. Then I'll continue with the final bits that write the captured image into the video buffer.

(for now don't call [ASScreenRecorder sharedInstance].delegate = self; as we've not implemented the delegate method yet, so the compiler will complain. We'll get to that bit next). Good luck.

alskipp avatar Sep 12 '14 21:09 alskipp

Just realised I missed out a vital bit. Sorry! We need to declare ourselves as delegate to AVCaptureVideoDataOutput. Otherwise we won't receive any calls to captureOutput:didOutputSampleBuffer:fromConnection:. I'll post some code examples in a moment.

alskipp avatar Sep 13 '14 09:09 alskipp

First we need a dispatch queue to receive calls to captureOutput:didOutputSampleBuffer:fromConnection: - we don't want to block the main thread with video processing.

@property (strong, nonatomic) dispatch_queue_t videoQueue;

We then initialize the queue and use it when we declare ourselves as delegate to the captureSession output.

Here's some code that shows how the capture session is setup:

- (void)setupCamera
{
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    if (![captureDevice hasMediaType:AVMediaTypeVideo]) {
        return;
    }

    _videoQueue = dispatch_queue_create("CameraViewController.videoQueue", DISPATCH_QUEUE_SERIAL);

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];

    self.captureSession = [[AVCaptureSession alloc] init];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];

    output.videoSettings = @{ (NSString*)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self queue:_videoQueue];

    [self.captureSession addOutput:output];
    [self.captureSession addInput:input];
    [self.captureSession setSessionPreset:AVCaptureSessionPreset640x480];

    _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];

    [self.cameraView setVideoPreviewLayer:_previewLayer];
}

The vital bit that you need to receive the sampleBuffers is:

[output setSampleBufferDelegate:self queue:_videoQueue];

Once that is declared you should start receiving calls to captureOutput:didOutputSampleBuffer:fromConnection:.

alskipp avatar Sep 13 '14 09:09 alskipp

Yes, this is a jigsaw puzzle with many pieces : )

alskipp avatar Sep 13 '14 09:09 alskipp

By the way, if you're using a third party library to do the video capture, then most of this boilerplate code should already be set up for you.

In that case you would set the third party library as the delegate of ASScreenRecorder, locate where captureOutput:didOutputSampleBuffer:fromConnection: is implemented, and add the extra code there.

alskipp avatar Sep 13 '14 12:09 alskipp

Hi @alskipp - everything works fine and [self.delegate writeBackgroundFrameInContext:&bitmapContext]; is getting called.

But then the app crashed with this message: [VideoViewController writeBackgroundFrameInContext:]: unrecognized selector sent to instance

How can I implement this method?

nihtin9mk avatar Sep 15 '14 12:09 nihtin9mk

Hi, I can post some example code, but not until after 19:00 GMT today. If I get a spare moment during the day, I'll try and point you in the right direction to complete the task.

Al

alskipp avatar Sep 15 '14 13:09 alskipp

Thank you so much @alskipp. I have done the below but don't know what to do next -

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CGImageRef image = [self createCGImageFromSampleBuffer:sampleBuffer];
    UIImage *uiimage = [[UIImage alloc] initWithCGImage:image];
    NSLog(@"uiimage---%@", uiimage);

    CGImageRelease(image);
}

-(void)writeBackgroundFrameInContext:(CGContextRef *)contextRef
{

}

nihtin9mk avatar Sep 15 '14 13:09 nihtin9mk

Hi, The idea is reasonably simple, but due to threading the implementation needs to be very careful.

writeBackgroundFrameInContext: will be called regularly - you just need to draw an image into the context using CGContextDrawImage. We need to have a CGImage instance variable ready to draw into the context - however, the details are a bit complicated.

writeBackgroundFrameInContext: will be called on a background queue, and the CGImage instance variable is mutable state which we will be updating elsewhere - we need to be very careful to prevent threading issues.

What we'll need: 2 instance vars

{
    CGImageRef _capturedImage;
    BOOL _needsNewImage; // set to YES before recording starts
}

1 dispatch_queue_t declared as a property - will be used to access _capturedImage

@property (strong, nonatomic) dispatch_queue_t imageQueue;

- (void)viewDidLoad {
    [super viewDidLoad];
    _imageQueue = dispatch_queue_create("CameraViewController.imageQueue", DISPATCH_QUEUE_SERIAL);
}

We need to update the CGImageRef:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    dispatch_sync(_imageQueue, ^{
        if (_needsNewImage) {
            _needsNewImage = NO;
            CGImageRelease(_capturedImage); // safe to use on NULL
            _capturedImage = [self createCGImageFromSampleBuffer:sampleBuffer];
        }
    });
}

Then to write the image into the context. You might need to adjust the context and position of the drawing for your own needs - the following is an example:

- (void)writeBackgroundFrameInContext:(CGContextRef *)contextRef
{
    dispatch_sync(_imageQueue, ^{
        if (_capturedImage) {
            CGContextSaveGState(*contextRef);
            CGAffineTransform flipRotate = CGAffineTransformMake(0.0, 1.0, 1.0, 0.0, 0.0, 0.0);
            CGContextConcatCTM(*contextRef, flipRotate);

            CGContextDrawImage(*contextRef, CGRectMake(0,0, CGRectGetHeight(_cameraView.bounds), CGRectGetWidth(_cameraView.bounds)), _capturedImage);
            CGContextRestoreGState(*contextRef);

            _needsNewImage = YES;
        }
    });
}

The final thing to remember is to release _capturedImage when you have finished recording, otherwise the memory will leak.
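A minimal sketch of that cleanup, assuming you have a point in your controller where you know recording has finished (the method name here is illustrative, not part of ASScreenRecorder's API). Doing the release on _imageQueue keeps it safe against a concurrent captureOutput: callback:

```objc
- (void)cleanUpCapturedImage
{
    dispatch_sync(_imageQueue, ^{
        CGImageRelease(_capturedImage); // safe to call on NULL
        _capturedImage = NULL;
        _needsNewImage = NO;
    });
}
```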

alskipp avatar Sep 15 '14 18:09 alskipp

I've just edited the above post (forgot to call CGContextRestoreGState(*contextRef); - that's really important!).

Depending on how your view is positioned you might need to adjust the position you draw in CGContextDrawImage. To make this easier to experiment with, it might be a good idea to change the following:

In ASScreenRecorder.m locate this code:

if (self.delegate) {
    [self.delegate writeBackgroundFrameInContext:&bitmapContext];
}
// draw each window into the context (other windows include UIKeyboard, UIAlert)
// FIX: UIKeyboard is currently only rendered correctly in portrait orientation
dispatch_sync(dispatch_get_main_queue(), ^{
    UIGraphicsPushContext(bitmapContext); {
        for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
            [window drawViewHierarchyInRect:CGRectMake(0, 0, _viewSize.width, _viewSize.height) afterScreenUpdates:NO];
        }
    } UIGraphicsPopContext();
});

Move the delegate code to appear after the main drawing code listed above. Your video preview will then be drawn on top of everything else, making it easier to see if you are drawing it in the correct position.
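After the move, that section of ASScreenRecorder.m would look something like this (the same two pieces of code from above, just swapped in order):

```objc
// draw each window into the context first
dispatch_sync(dispatch_get_main_queue(), ^{
    UIGraphicsPushContext(bitmapContext); {
        for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
            [window drawViewHierarchyInRect:CGRectMake(0, 0, _viewSize.width, _viewSize.height) afterScreenUpdates:NO];
        }
    } UIGraphicsPopContext();
});

// delegate drawing now happens last, so the video preview is drawn on top
if (self.delegate) {
    [self.delegate writeBackgroundFrameInContext:&bitmapContext];
}
```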

alskipp avatar Sep 15 '14 19:09 alskipp

Ohh @alskipp - thank you very much, it worked at last!

Only two minor problems - maybe you can point out my mistakes. The myVideoView's alignment is wrong - it is rotated 90 degrees to the right and positioned on the left side. Also, after I end recording, the app crashes.

nihtin9mk avatar Sep 15 '14 19:09 nihtin9mk

"_cameraView" in my example is the view that contains the live video preview. It's only used to calculate the size and position to draw into the context. Could you confirm that a CGImage is being created in - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

alskipp avatar Sep 15 '14 19:09 alskipp

If the image is created successfully then it's just a matter of getting the positioning right when drawing it into the context.

alskipp avatar Sep 15 '14 19:09 alskipp

Sorry for my first comment - I just forgot to move the delegate code to appear after the main drawing code listed above. That's why that view didn't come out in the video. Now it works fine - only the above issue remains.

nihtin9mk avatar Sep 15 '14 19:09 nihtin9mk

Is everything working now? To get the correct positioning you'll have to experiment with the CGRect in CGContextDrawImage:

CGContextDrawImage(*contextRef, CGRectMake(50, 50, 100, 100), _capturedImage);

alskipp avatar Sep 15 '14 19:09 alskipp

Yes, now everything works fine. Thank you for your tremendous support.

Now I am experimenting with the CGRect in CGContextDrawImage. But the orientation is also different in the video.

nihtin9mk avatar Sep 15 '14 19:09 nihtin9mk

If the orientation is incorrect you just have to transform the context. For my use case, I needed the following:

CGAffineTransform flipRotate = CGAffineTransformMake(0.0, 1.0, 1.0, 0.0, 0.0, 0.0);
CGContextConcatCTM(*contextRef, flipRotate);

You could try commenting out that bit, or try a different transformation to see if it works.

alskipp avatar Sep 15 '14 19:09 alskipp

Yeah, I am still trying to get the proper orientation.

At times it crashes on: BOOL success = [_avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];

with ERROR - Thread 3: EXC_BAD_ACCESS (code=1, address=0x4bc000)

nihtin9mk avatar Sep 15 '14 19:09 nihtin9mk

Hmmm, the dreaded EXC_BAD_ACCESS. Has this only happened since adding the new code? I've not encountered the EXC_BAD_ACCESS crash, but threaded crashes are notoriously unpredictable. I'll have to think about what the cause could be.

alskipp avatar Sep 15 '14 20:09 alskipp

Yeah, after adding this new code. It's not happening continuously, but it surely happens once in every 3-4 uses.

nihtin9mk avatar Sep 15 '14 20:09 nihtin9mk

Hi @alskipp - any solutions?

nihtin9mk avatar Sep 16 '14 19:09 nihtin9mk