SCNBook-code
Unable to assign AVCaptureVideoPreviewLayer to SCNScene background contents
First of all - thank you for this awesome book! I teach Objective-C to art students at The School of the Art Institute of Chicago and found your book to be a super helpful tool.
I'm trying to modify the example code on pages 63-64, in which you assign the camera input as a reflective material on the spacecraft, so that I can set the camera preview as a background to an SCNScene (I'm doing this as an iOS app, not a mac app).
I'm having no luck and am not sure what I'm doing wrong. The documentation clearly states that this is possible, and while I am able to get the video preview to work in a UIView, I am unable to get it working in my SCNScene. What am I doing wrong?
My code is below. I also posted the project on bitbucket. Setting the useCaptureView BOOL to YES demonstrates that the video preview code works in a UIView.
Any help would be very much appreciated!
//
//  ViewController.m
//
//  Created by Abraham Avnisan on 6/14/16.
//  Copyright © 2016 Abraham Avnisan. All rights reserved.
//
#import "ViewController.h"

@import SceneKit;
@import AVFoundation;

@interface ViewController ()

@property (strong, nonatomic) SCNView *sceneView;

// AV Foundation properties
@property (strong, nonatomic) AVCaptureSession *captureSession;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *captureLayer;
@property (strong, nonatomic) UIView *captureView;

@end

@implementation ViewController

- (void)setup
{
    // create an SCNView and add it to the view
    self.sceneView = [[SCNView alloc] initWithFrame:self.view.bounds options:nil];
    SCNScene *scene = [[SCNScene alloc] init];
    self.sceneView.scene = scene;
    [self.view addSubview:self.sceneView];

    // create a smaller UIView to test the capture video layer
    float height = self.view.bounds.size.height / 2.0;
    float width  = self.view.bounds.size.width / 2.0;
    CGRect frame = CGRectMake(self.view.bounds.size.width / 2.0 - width / 2.0,
                              self.view.bounds.size.height / 2.0 - height / 2.0,
                              width,
                              height);
    self.captureView = [[UIView alloc] initWithFrame:frame];
    [self.view addSubview:self.captureView];

    // set up and start the video preview
    [self startCamera];
}

- (void)startCamera
{
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (!camera) {
        NSLog(@"ERROR - could not access camera");
        return;
    }

    self.captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:nil];
    [self.captureSession addInput:newVideoInput];

    self.captureLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    [self.captureLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    // Start the session. This is done asynchronously since -startRunning
    // doesn't return until the session is running.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self.captureSession startRunning];
    });

    BOOL useCaptureView = NO;
    if (useCaptureView) {
        self.captureLayer.frame = self.captureView.bounds;
        [self.captureView.layer addSublayer:self.captureLayer];
        self.sceneView.backgroundColor = [UIColor blueColor];
    } else {
        self.captureLayer.frame = self.sceneView.bounds;
        self.sceneView.scene.background.contents = self.captureLayer;
        self.captureView.backgroundColor = [UIColor redColor];
    }
}

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [self setup];
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end
I'm away and very busy at work this week. Let me get back to you sometime in the beginning of next week.
Thanks @d-ronnqvist, that would be very much appreciated!!
This might be related to this: http://stackoverflow.com/questions/29805632/using-mpmovieplayercontroller-as-texture-in-scenekit — I've personally not been able to use an AVPlayerLayer for a similar purpose.
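For what it's worth, one workaround that sidesteps the layer issue entirely is to capture raw frames with an AVCaptureVideoDataOutput and push each frame into the scene background as a CGImage, which background.contents does accept. Below is an untested sketch of that idea; FrameToBackground is a hypothetical helper class (not from the project), and you'd attach it as the sample-buffer delegate of a data output added to the existing captureSession:

```objc
@import SceneKit;
@import AVFoundation;
@import CoreImage;

// Hypothetical helper: receives camera frames and copies each one
// into the scene background as a CGImage.
@interface FrameToBackground : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (strong, nonatomic) SCNScene *scene;
@property (strong, nonatomic) CIContext *ciContext;
@end

@implementation FrameToBackground

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Convert the raw pixel buffer into a CGImage that SceneKit understands.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CGImageRef cgImage = [self.ciContext createCGImage:ciImage fromRect:ciImage.extent];

    dispatch_async(dispatch_get_main_queue(), ^{
        // Assigning retains the image; release our create-rule reference after.
        self.scene.background.contents = (__bridge id)cgImage;
        CGImageRelease(cgImage);
    });
}

@end
```

You'd wire it up with something like an AVCaptureVideoDataOutput whose setSampleBufferDelegate:queue: points at an instance of this class. Caveats: it does a per-frame CPU round trip through Core Image, and the frame orientation may not match the screen, so it's a starting point rather than a polished solution.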