
How Can I Add Face Detector?

ToanNguyenCong opened this issue 10 years ago • 5 comments

I want to add a face detector using Core Image but can't get it working. Can you tell me what I need to do?

ToanNguyenCong avatar Oct 01 '15 09:10 ToanNguyenCong

I'd like to know this too. How can we do this with SCFilter?

felixchan avatar Nov 25 '15 11:11 felixchan

I'd like to know this too.

qingfeng avatar Oct 09 '16 10:10 qingfeng

I managed to achieve this recently in an iMessage app using SCRecorder.

I subclassed SCRecorder and overrode the `captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection` method, detecting faces in each sample buffer:

- (void)setupCIDetector
{
    // Low accuracy keeps per-frame detection fast enough for real time.
    faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                      context:nil
                                      options:@{CIDetectorAccuracy: CIDetectorAccuracyLow}];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (!faceDetector) {
        [self setupCIDetector];
    }

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault,
                                                                sampleBuffer,
                                                                kCMAttachmentMode_ShouldPropagate);

    CIImage *ciImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer
                                                      options:(__bridge NSDictionary *)attachments];
    if (attachments) {
        CFRelease(attachments);
    }

    // Do something with the detected faces here, e.g. forward them to the UI.
    NSArray *features = [faceDetector featuresInImage:ciImage options:nil];

    [super captureOutput:captureOutput didOutputSampleBuffer:sampleBuffer fromConnection:connection];
}

Now that we've detected faces, you need to update the on-screen props so they're displayed in the correct position. Using the above code you can position props in real time, but you will need to implement something else for the export. In my case I was recording video and exporting it as a GIF, so I ran CIDetector on each frame of the GIF export to calculate the correct position for the props, then used UIGraphics to draw the overlay view onto the image.
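One detail worth spelling out when positioning props: CIDetector returns feature bounds in Core Image coordinates (origin at the bottom-left of the buffer), while UIKit views use a top-left origin. A minimal conversion sketch, assuming the preview view shows the whole buffer without aspect-fill cropping (the function name is illustrative, not part of SCRecorder):

```swift
import UIKit
import CoreImage

// Convert a CIFaceFeature's bounds (bottom-left origin, buffer pixels)
// into UIKit view coordinates (top-left origin, points).
// `bufferSize` is the pixel size of the sample buffer,
// `viewSize` is the size of the on-screen preview.
func faceRectInView(featureBounds: CGRect,
                    bufferSize: CGSize,
                    viewSize: CGSize) -> CGRect {
    let scaleX = viewSize.width / bufferSize.width
    let scaleY = viewSize.height / bufferSize.height
    // Flip the y-axis, then scale into view coordinates.
    let flippedY = bufferSize.height - featureBounds.origin.y - featureBounds.height
    return CGRect(x: featureBounds.origin.x * scaleX,
                  y: flippedY * scaleY,
                  width: featureBounds.width * scaleX,
                  height: featureBounds.height * scaleY)
}
```

If the preview uses aspect-fill, you also need to account for the cropped region before scaling.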

Hope this helps! It is definitely possible, even with the use of 3D objects! ~~Here's a video of a gif iMessage sticker generated from my app: https://sendvid.com/ww2wi55h~~

EDIT: Since the video is no longer available, here's the iMessage app I built using the code above: https://itunes.apple.com/us/app/sticker-booth-animated-gif/id1157522905?mt=8

twomedia avatar Oct 09 '16 13:10 twomedia

Great Work @twomedia !

Could you upload a demo showing how to track 3D objects on the face?

It would be great!

Regards

vexelgray avatar Oct 26 '16 14:10 vexelgray

@rFlex @twomedia @vexelgray How do I access the features array from the view controller that is displaying my camera view? So far I have created a subclass of SCRecorder called Detector that does what you have done above. Then I created an instance of it with `let myDetector = Detector()` and printed `myDetector.allFeatures.count`, but the size of this array is 0.

```swift
import UIKit
import AVFoundation
import CoreImage
import SCRecorder

class Detector: SCRecorder {

    var faceDetector: CIDetector?

    var allFeatures = [CIFeature]()

    func setupCIDetector() {
        faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil,
                                  options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    }

    override func captureOutput(_ captureOutput: AVCaptureOutput!,
                                didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                                from connection: AVCaptureConnection!) {
        // Create the detector once, not on every frame.
        if faceDetector == nil {
            setupCIDetector()
        }

        let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)

        let attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer,
                                                        kCMAttachmentMode_ShouldPropagate)

        if #available(iOS 9.0, *) {
            let ciImage = CIImage(cvImageBuffer: pixelBuffer!,
                                  options: attachments as? [String: Any])

            allFeatures = faceDetector?.features(in: ciImage) ?? []
            print("😩: \(allFeatures.count)")
        } else {
            // Fallback on earlier versions
        }

        super.captureOutput(captureOutput, didOutputSampleBuffer: sampleBuffer, from: connection)
    }
}
```

AtoHenok avatar Jan 07 '17 17:01 AtoHenok
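
A likely reason the printed count is 0 in the snippet above: `myDetector` is a second, freshly constructed `Detector` that never receives sample buffers; only the instance actually driving the capture session has its `captureOutput` called. One way to surface features to a view controller is a callback property on the subclass. A hedged sketch, assuming the approach from twomedia's comment (`onFeaturesDetected` is an illustrative name, not part of SCRecorder):

```swift
import AVFoundation
import CoreImage
import SCRecorder

class Detector: SCRecorder {

    // Illustrative callback, invoked after each detection pass.
    var onFeaturesDetected: (([CIFeature]) -> Void)?

    private lazy var faceDetector: CIDetector? = CIDetector(
        ofType: CIDetectorTypeFace,
        context: nil,
        options: [CIDetectorAccuracy: CIDetectorAccuracyLow])

    override func captureOutput(_ captureOutput: AVCaptureOutput!,
                                didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                                from connection: AVCaptureConnection!) {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
           let detector = faceDetector {
            let image = CIImage(cvImageBuffer: pixelBuffer)
            let features = detector.features(in: image)
            // Hop to the main thread before touching UIKit.
            DispatchQueue.main.async { [weak self] in
                self?.onFeaturesDetected?(features)
            }
        }
        super.captureOutput(captureOutput, didOutputSampleBuffer: sampleBuffer, from: connection)
    }
}
```

In the view controller, set the closure on the same `Detector` instance you start the session with, rather than creating a new one.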