SCRecorder

Live filter in the record scene, rather than selecting the filter after recording

Open fayhot opened this issue 10 years ago • 43 comments

I have another problem. I am trying to apply a live filter while recording in the record view, rather than applying a filter afterward in the post-processing view. So I set the CIImageRenderer (as documented: /** If set, this renderer will receive every received frame as CIImage. Can be useful for displaying a real time filter for example. */).

I set the CIImageRenderer the same way as the demo does, but there is no effect.

Any help is appreciated. Thanks.

fayhot avatar Jul 13 '15 04:07 fayhot

If you figure this out, let me know! I've been trying to come up with a solution to this as well

anthonycastelli avatar Jul 15 '15 03:07 anthonycastelli

@anthonycastelli
I found the reason. Everything works except the last step.

While the camera is live, the delegate methods are triggered. However, in the last step, setNeedsDisplay fails to trigger the - (void)drawRect:(CGRect)rect method of SCImageView. I tried calling it manually from - (void)setImageBySampleBuffer:(CMSampleBufferRef)sampleBuffer, but there was still no response, which I did not expect.

I think the cause must lie somewhere in the lower-level API, but I have no idea where. So I tried another approach. It is dirty, but it actually works: never render the preview; instead, place a UIImageView with the same frame above the preview, and update that UIImageView whenever the preview buffer changes.

  1. Create a new object that implements CIImageRenderer (something like SCImageView is fine).
  2. Override the - (void)setImageBySampleBuffer:(CMSampleBufferRef)sampleBuffer method.
  3. Set a delegate.
  4. Each time setImageBySampleBuffer is triggered, update the overlaid UIImageView through a delegate method.

The following is a demo capture. I'm new to Objective-C; I'll clean up the code and add some selectors.

(screenshot of the demo)

fayhot avatar Jul 16 '15 09:07 fayhot
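The four steps above could be sketched roughly as follows. This is a hypothetical illustration, not code from the thread: the class name ForwardingImageView, the FrameRendererDelegate protocol, and the sample-buffer conversion are invented for the sketch, and assume SCImageView can be subclassed as described.

```objc
// Hypothetical CIImageRenderer implementation that forwards each frame
// to a delegate instead of rendering it, so the delegate can update a
// UIImageView placed above the (never-rendered) preview.
@protocol FrameRendererDelegate <NSObject>
- (void)frameRenderer:(id)renderer didProduceImage:(UIImage *)image;
@end

@interface ForwardingImageView : SCImageView
@property (nonatomic, weak) id<FrameRendererDelegate> frameDelegate;
@end

@implementation ForwardingImageView

- (void)setImageBySampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // A CIFilter could be applied to ciImage here before display.
    [self.frameDelegate frameRenderer:self
                      didProduceImage:[UIImage imageWithCIImage:ciImage]];
}

@end
```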

That might work, but I'm sure there are better ways to accomplish this. Snapchat does live filters as well as geofilters, and I'm pretty sure they aren't using a UIImageView and setting a CIImage 30/60 times a second, or whatever the preview frame rate is. @rFlex Do you have any suggestions or ideas for this one?

anthonycastelli avatar Jul 16 '15 15:07 anthonycastelli

Setting a CIImageRenderer inside SCRecorder will not change the buffers that are actually recorded. It only makes the CIImageRenderer the receiver of the image buffers from the camera, so you can display a live filter. If you want the filters to actually be applied to the file, you need to set the filter inside SCVideoConfiguration (which handles the output video configuration). Was that your question?

rFlex avatar Jul 18 '15 16:07 rFlex
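In other words, the live preview and the recorded file are configured separately. A minimal sketch, assuming the API names used elsewhere in this thread (a CIImageRenderer property on SCRecorder, a filter property on SCImageView, and a filter property on SCVideoConfiguration):

```objc
SCFilter *filter = [SCFilter filterWithCIFilterName:@"CISepiaTone"];

// 1) On-screen only: route camera frames to a CIImageRenderer for live display.
SCImageView *filterView = [[SCImageView alloc] initWithFrame:self.view.bounds];
filterView.filter = filter;
recorder.CIImageRenderer = filterView;
[self.view addSubview:filterView];

// 2) Baked into the recorded file: set the same filter on the video configuration.
recorder.videoConfiguration.filter = filter;
```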

Oh, great. Thanks for your answer; that hits the spot. By the way, regarding "you can display a live filter": is it possible to do that with just the library, without much extra code?

fayhot avatar Jul 18 '15 17:07 fayhot

@fayhot I’m using a SCSwipeableFilterView widget. My setup is:

self.filterSwitcherView.refreshAutomaticallyWhenScrolling = FALSE;
self.filterSwitcherView.contentMode = UIViewContentModeScaleAspectFill;

self.filterSwitcherView.filters = @[
    [SCFilter emptyFilter],
    [SCFilter filterWithCIFilterName:@"CIPhotoEffectChrome"],
    [SCFilter filterWithCIFilterName:@"CIPhotoEffectInstant"],
    [SCFilter filterWithCIFilterName:@"CIPhotoEffectTonal"]
];
recorder.CIImageRenderer = self.filterSwitcherView;

This is not working well; it's slow. There seems to be a delay while rendering. I don't know what's happening.

xportation avatar Aug 24 '15 03:08 xportation

@fayhot Turns out, you simply need to do _filterSwitcherView.CIImage = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0]]; (it could be any arbitrary color) in viewDidLoad of the view controller that has the SCSwipeableFilterView. This ultimately calls the private method _loadContext, which sets up the essential EAGLContext and CIContext for the internal GLKView to actually start rendering the filtered sample buffers.

xezero avatar Sep 10 '15 07:09 xezero

@xezero, you're right. I got the same result today. I actually spent three days reading through the source code.

You're great.

fayhot avatar Sep 10 '15 16:09 fayhot

@xportation

Thanks a lot. Your code makes sense.

I've read the SCRecorder source code, and there's no better way than yours.

fayhot avatar Sep 10 '15 16:09 fayhot

@fayhot

Thanks! Glad it worked for you! :)

xezero avatar Sep 10 '15 16:09 xezero

@xezero @fayhot How do you resolve the problem?

I setup in viewDidLoad

// @IBOutlet weak var previewView: SCImageView!

let _randomCIImage = CIImage(color: CIColor(red: 0, green: 0, blue: 0))

// CIImage has infinite extent, crop back to normal size
let cropFilter = CIFilter(name: "CICrop", withInputParameters: [ "inputImage": _randomCIImage, "inputRectangle": CIVector(CGRect: CGRectMake(0, 0, UIScreen.mainScreen().bounds.size.width, UIScreen.mainScreen().bounds.size.height)) ])

self.previewView.CIImage = cropFilter?.outputImage

and in later somewhere (after viewDidAppear)

// var recorder: SCRecorder!
self.recorder.previewView = self.previewView

self.previewView.filter = self.processTheme?.filter()?.scFilter // some SCFilter which is not null

What I see is the correct camera view with no filter on it. drawRect is not called every frame; it is called only on setFilter:(SCFilter *)filter.

any ideas?

I have tried assigning the SCImageView to recorder.CIImageRenderer, which results in a filtered view but more than 1000 ms of lag in the camera feed (even on an iPhone 6).

However, the lag somehow does not happen on iOS 8; it suddenly appears on iOS 9.

@rFlex Do you have any idea about this?

hkbenchan avatar Sep 23 '15 09:09 hkbenchan

About Q1: something may be wrong with your preview, which must implement CIImageRenderer. Use SCImageView, SCSwipeableFilterView, or any custom UIView that implements that protocol. As you say, I guess you ran a test, but the usage is wrong.

self.recorder.CIImageRenderer = id<CIImageRenderer>

Certainly, self.previewView.CIImage = cropFilter?.outputImage goes the wrong way.

Unless you want a real-time comparison, there's no need to set the recorder's preview. self.recorder.CIImageRenderer = id<CIImageRenderer> already sends the raw image data to the render view, where it is redrawn in the GLKView. Once you have added the render view to the controller's root view, the filtered image will come out.

About Q2: it's a problem on top of Q1. Since Core Image is slow, be careful which CIFilter you choose; a CIFilter costing under 20 ms is acceptable. You can also write a custom filter with OpenGL, though Apple imposes some rules and many limits.

fayhot avatar Sep 25 '15 10:09 fayhot

It's because the context is actually never initialized. It's not documented, and I had to dig through the code to realize it. Try creating a CIImage from an arbitrary CIColor and setting it as the swipeable filter view's CIImage property in viewDidLoad:

swipeableFilterView.CIImage = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0 alpha:1.0]];

drawRect should then be called once you've done that. Also, you don't want to set a preview view if you're using a CIImageRenderer.


xezero avatar Sep 25 '15 17:09 xezero

@fayhot

Like @xezero said, the reason I put the CIImage assignment in viewDidLoad is that the context is never initialized unless a solid CIImage is passed in.

Actually, I did not set the preview view when using the CIImageRenderer; I directly set the CIImageRenderer of SCRecorder to my SCImageView.

What's weird is that the same code runs smoothly on iOS 8 but suddenly becomes very laggy on iOS 9.

On iOS 8, the CIImageRenderer receives the correct image frame every frame, while on iOS 9 it receives frames that are at least 1000 ms old. So I am wondering what makes the difference.

hkbenchan avatar Sep 28 '15 05:09 hkbenchan

I'm noticing the same thing on iOS 9.0. Trying to figure out what's up. Will update if I do!


xezero avatar Sep 28 '15 06:09 xezero

@ustbenchan More details are needed: the platform and the exact system version. There are some terrible bugs on iOS 9; Core Image takes much more memory there. I wrote some OpenGL filters, such as a bilateral shader, that take more than 100 ms.

However, the basic filters Apple supplies in Core Image, wrapped as CIFilter, cost little, about 20 ms. There must be something wrong with your usage.

fayhot avatar Sep 28 '15 16:09 fayhot

I'm experiencing the same issues listed above: a considerable amount of lag from setting the CIImage of the filter view. After removing these lines from viewDidLoad, the video records smoothly.

CIImage *randomImage = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0]];
CIFilter *cropFilter = [CIFilter filterWithName:@"CICrop"
                            withInputParameters:@{@"inputImage": randomImage,
                                                  @"inputRectangle": [CIVector vectorWithCGRect:self.view.bounds]}];
_filterView.CIImage = cropFilter.outputImage;

Reproduced on an iPhone 6 (iOS 9) with Xcode 7.0.

jajhar avatar Sep 28 '15 18:09 jajhar

Let me prepare a simple project to demo this problem.

I will finish wrapping it up in the next two or three days.

hkbenchan avatar Sep 29 '15 07:09 hkbenchan

@jajhar Try just doing _filterView.CIImage = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0]];

@ustbenchan Try turning off video stabilization in SCRecorder.m, line 979, to see if that fixes your latency issues:

videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeOff;
videoConnection.enablesVideoStabilizationWhenAvailable = NO;

xezero avatar Sep 29 '15 18:09 xezero

@xezero the second change worked for me. Setting these two:

videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeOff;
videoConnection.enablesVideoStabilizationWhenAvailable = NO;

jajhar avatar Sep 29 '15 21:09 jajhar

@jajhar Awesome! :)

xezero avatar Sep 29 '15 21:09 xezero

@xezero This works! I'm not sure why this latency issue appeared on iOS 9. I would prefer not to modify the library directly, since we import it as a CocoaPods dependency.

@rFlex Actually, is it possible to expose the videoConnection? Or at least give us a block to configure both the video and audio input before startRunning?

In case someone is looking for a temporary fix, here is a Swift version:


import Foundation
import AVFoundation

extension SCRecorder {

  private func _videoConnection() -> AVCaptureConnection? {

    if let _outputs = self.captureSession?.outputs {

      for output in _outputs {
        if let _captureOutput = output as? AVCaptureVideoDataOutput {

          for connection in _captureOutput.connections {
            if let captureConnection = connection as? AVCaptureConnection {

              for port in captureConnection.inputPorts {
                if let _port = port as? AVCaptureInputPort {
                  if _port.mediaType == AVMediaTypeVideo {
                    return captureConnection
                  }
                }
              }
            }
          }
        }
      }
    }
    return nil

  }

  func attemptTurnOffVideoStabilization() {

    self.beginConfiguration()

    let videoConnection = self._videoConnection()
    if let connection = videoConnection {

      if connection.supportsVideoStabilization {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationMode.Off
      }

    }

    self.commitConfiguration()

  }

}

Just call

recorder.attemptTurnOffVideoStabilization()

after

recorder.startRunning()

hkbenchan avatar Sep 30 '15 06:09 hkbenchan

@ustbenchan I like your extension! I did something similar to turn it off:

- (void)reconfigureVideoConnection {
    // We'll disable video stabilization for now so we don't get any latency
    // We also need to reconfigure this video connection every time the device is initialized or changed (i.e. front -> back -> front)
    for (AVCaptureConnection * connection in _recorder.videoOutput.connections) {
        for (AVCaptureInputPort * port in connection.inputPorts) {
            if ([port.mediaType isEqual:AVMediaTypeVideo]) {
                if (connection.isVideoStabilizationSupported) {
                    connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeOff;

                    #pragma clang diagnostic push
                    #pragma clang diagnostic ignored "-Wdeprecated-declarations"
                    connection.enablesVideoStabilizationWhenAvailable = NO;
                    #pragma clang diagnostic pop
                }

                return;
            }
        }
    }
}

One thing to note: if you switch camera devices (i.e. front to back) while the session is still active, you will need to reconfigure the video connection, as it gets reset.

xezero avatar Sep 30 '15 08:09 xezero

Opened up an issue for the latency with video stabilization as well: https://github.com/rFlex/SCRecorder/issues/217

xezero avatar Sep 30 '15 08:09 xezero

What was the final solution?

dimohamdy avatar Dec 12 '15 19:12 dimohamdy

Does anyone have an example of displaying AND rendering/saving a live filter? I don't see anything in this thread or in the examples that actually demonstrates how to apply a live filter (one that you see while recording) and save the filter on export.

If I add a filter to the videoConfiguration, it doesn't display on the preview, and when it's saved, it's just a black screen. Are live filters possible? The answer isn't clear.

cerupcat avatar Jan 29 '16 23:01 cerupcat

@cerupcat SCRecorder has an SCImageView property (it used to be CIImageRenderer) which is what you are looking for. Wherever you see CIImageRenderer in this thread, read it as SCImageView.

renjithn avatar Jan 30 '16 11:01 renjithn

Thanks @renjithn. However, following the examples above, I still can't get any live filter to show up. I'm currently trying the following code, but it just shows the regular camera input. The filter is also not applied to the output. I'm surprised there's no demo of live filters in the example project.

    SCFilter *blurFilter = [SCFilter filterWithCIFilterName:@"CIGaussianBlur"];
    [blurFilter.CIFilter setValue:[NSNumber numberWithFloat:100] forKey:kCIInputRadiusKey];

    self.filterSwitcherView = [[SCSwipeableFilterView alloc] initWithFrame:previewView.frame];
    self.filterSwitcherView.refreshAutomaticallyWhenScrolling = FALSE;
    self.filterSwitcherView.contentMode = UIViewContentModeScaleAspectFill;
    self.filterSwitcherView.filters = @[blurFilter];
    _recorder.SCImageView = self.filterSwitcherView;

cerupcat avatar Jan 30 '16 23:01 cerupcat

@cerupcat In the code above I don't see self.filterSwitcherView being added to any view. Maybe that's what's missing, i.e. [self.view addSubview:self.filterSwitcherView]; or, better, you could have it in the storyboard.

The above will only "preview" the filters. For exporting, you will need to set the selected filter in SCVideoConfiguration so that it is reflected in the output.

renjithn avatar Jan 31 '16 12:01 renjithn
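A rough sketch of the export step described above. The selectedFilter property on SCSwipeableFilterView and the SCAssetExportSession usage are assumptions based on the SCRecorder demo project, not something stated in this thread:

```objc
// Export the recorded session with the filter the user picked in the
// SCSwipeableFilterView, so the filter is baked into the output file.
SCAssetExportSession *exportSession =
    [[SCAssetExportSession alloc] initWithAsset:recorder.session.assetRepresentingSegments];
exportSession.videoConfiguration.filter = self.filterSwitcherView.selectedFilter;
exportSession.outputUrl = outputUrl; // destination file URL, defined elsewhere
exportSession.outputFileType = AVFileTypeMPEG4;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.error == nil) {
        // Export succeeded; the file at outputUrl has the filter applied.
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];
```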

Thanks @renjithn. If I add filterSwitcherView as a subview, it's just a black screen. If I add it to the SCVideoConfiguration it's also just a black screen on output. I've only tried the blur filter so far though, so I'm not sure if that's the issue or not.

cerupcat avatar Jan 31 '16 22:01 cerupcat