BBMetalImage
Filter out background and replace with color/image/video (Feature request)
Hi @Silence-GitHub
Following this tutorial I managed to filter out the background by creating a heat map of an object. Note that there is also the option of attention-based saliency analysis instead of the object-based one, but that will only heat-map the face of a person if it sees one, rather than the entire body (try the example app in the tutorial to see what I mean). I therefore recommend using object-based saliency analysis.
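For reference, switching between the two analyses only changes which Vision request you create (both require iOS 13+); everything downstream stays the same. A minimal sketch:

import Vision

// Object-based: heat-maps whole foreground objects (what I use below)
let objectRequest = VNGenerateObjectnessBasedSaliencyImageRequest()

// Attention-based: heat-maps where a viewer would look first,
// which tends to collapse to just a person's face
let attentionRequest = VNGenerateAttentionBasedSaliencyImageRequest()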
The heat-map code I can provide is not much, I guess, but it is better than nothing and should save you some time.
Create an image view (with a backgroundColor set) somewhere that does not cover the entire screen:
// e.g. I did this using Stevia autolayout
flashImage.Top == metalView.Top
flashImage.Left == metalView.Left
flashImage.height(240)
flashImage.width(135)
When I now switch on the object-based saliency analysis by setting camera.preprocessVideo = handleObject, the image of flashImage shows the current frame with the background filtered out and only the object visible.
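For context, this is roughly how the camera is wired up. A sketch only; the session preset and view setup here are assumptions from my project, not part of the snippet above:

import BBMetalImage

// BBMetalCamera's initializer is failable, hence the unwrap
let camera = BBMetalCamera(sessionPreset: .hd1280x720)!
camera.add(consumer: metalView)          // metalView renders the live feed
camera.preprocessVideo = handleObject    // called with every CMSampleBuffer
camera.start()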
func handleObject(in sampleBuffer: CMSampleBuffer) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { /* FIXME: Error handling */ return }
    // Object-based saliency: Vision returns a heat map covering whole foreground objects
    let saliencyRequest = VNGenerateObjectnessBasedSaliencyImageRequest { (request: VNRequest, error: Error?) in
        DispatchQueue.main.async {
            if let results = request.results as? [VNSaliencyImageObservation], let result = results.first {
                self.handleObjectTracking(result, in: imageBuffer)
            }
        }
    }
    let imageRequestHandler = VNImageRequestHandler(cvPixelBuffer: imageBuffer, orientation: .up, options: [:])
    try? imageRequestHandler.perform([saliencyRequest])
}
func handleObjectTracking(_ observedObject: VNSaliencyImageObservation, in imageBuffer: CVPixelBuffer) {
    // Stretch the saliency values to the full 0...1 range (helper extension, see below)
    observedObject.pixelBuffer.normalize()
    let normalImage = CIImage(cvImageBuffer: imageBuffer)
    currentFrame = normalImage
    var ciImage = CIImage(cvImageBuffer: observedObject.pixelBuffer)
    // The saliency heat map is much smaller than the camera frame, so scale it up to match
    let targetExtent = normalImage.extent
    let heatmapExtent = ciImage.extent
    let scaleX = targetExtent.width / heatmapExtent.width
    let scaleY = targetExtent.height / heatmapExtent.height
    ciImage = ciImage.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))
        .applyingGaussianBlur(sigma: 20.0) // soften the mask edges
        .cropped(to: targetExtent) // FIXME: needed?
    showFlashlight(with: ciImage)
}
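Note that normalize() above is not a CVPixelBuffer API; it comes from Apple's saliency sample code. A sketch of that extension, assuming the single-channel Float32 format that VNSaliencyImageObservation.pixelBuffer uses:

import CoreVideo

extension CVPixelBuffer {
    // Stretches all values to 0...1 so the heat map uses full contrast
    func normalize() {
        CVPixelBufferLockBaseAddress(self, [])
        defer { CVPixelBufferUnlockBaseAddress(self, []) }
        guard let baseAddress = CVPixelBufferGetBaseAddress(self) else { return }
        let width = CVPixelBufferGetWidth(self)
        let height = CVPixelBufferGetHeight(self)
        // Rows may be padded, so step by bytesPerRow instead of width
        let bytesPerRow = CVPixelBufferGetBytesPerRow(self)

        var minValue = Float.greatestFiniteMagnitude
        var maxValue = -Float.greatestFiniteMagnitude
        for row in 0..<height {
            let rowPtr = (baseAddress + row * bytesPerRow).assumingMemoryBound(to: Float.self)
            for col in 0..<width {
                minValue = min(minValue, rowPtr[col])
                maxValue = max(maxValue, rowPtr[col])
            }
        }

        let range = maxValue - minValue
        guard range > 0 else { return }
        for row in 0..<height {
            let rowPtr = (baseAddress + row * bytesPerRow).assumingMemoryBound(to: Float.self)
            for col in 0..<width {
                rowPtr[col] = (rowPtr[col] - minValue) / range
            }
        }
    }
}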
func showFlashlight(with heatMap: CIImage) {
    guard let frame = currentFrame else { print("no flashlight possible"); return }
    // Boost the heat map's alpha so it works as a blend mask
    let mask = heatMap.applyingFilter("CIColorMatrix", parameters: ["inputAVector": CIVector(x: 0, y: 0, z: 0, w: 2)])
    // White mask areas keep the camera frame; black areas become transparent
    let spotlight = frame.applyingFilter("CIBlendWithMask", parameters: ["inputMaskImage": mask])
    flashImage.image = UIImage(ciImage: spotlight)
}
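This is also where the feature request comes in: CIBlendWithMask additionally accepts an inputBackgroundImage, so the filtered-out background can be replaced by any CIImage: a solid color, a photo, or a video frame. A sketch (composite(frame:mask:over:) is just a name I made up):

import CoreImage

func composite(frame: CIImage, mask: CIImage, over background: CIImage) -> CIImage {
    return frame.applyingFilter("CIBlendWithMask", parameters: [
        "inputBackgroundImage": background.cropped(to: frame.extent),
        "inputMaskImage": mask
    ])
}

// e.g. replace the background with solid green:
// let green = CIImage(color: CIColor(red: 0, green: 1, blue: 0))
// let result = composite(frame: frame, mask: mask, over: green)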
We should now be able to set an imageSource in the background to achieve this effect. It would be great if one could also add a videoSource to the background by aligning the frames, as shown in this video at 1:12 (although in that example a video is not only layered under the object-based frames but another one is also layered over them); see the sketch below for one way to fetch matching video frames. If you have some time on your hands, you could add the filters one by one, e.g. a color filter first, then an image filter, and finally a video filter. It isn't urgent, though; it's just an idea.
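For the videoSource part, one way to get a per-frame background image would be AVPlayerItemVideoOutput. A rough sketch, assuming an AVPlayer with a playerItem is already set up and playing the clip:

import AVFoundation
import CoreImage

// Ask the player item for BGRA pixel buffers we can wrap in CIImages
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
playerItem.add(videoOutput)

// Call this once per camera frame; the result could then be passed as
// inputBackgroundImage in the composite sketch above
func currentBackgroundFrame(at hostTime: CFTimeInterval) -> CIImage? {
    let itemTime = videoOutput.itemTime(forHostTime: hostTime)
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
          let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
    else { return nil }
    return CIImage(cvPixelBuffer: pixelBuffer)
}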