Feature Request: Taking a bigger rect into account when blurring
I've noticed a difference between this and Apple's implementation of the blur effect. Apple is probably blurring a bigger rect than the .bounds of the frost view, and then clipping it to the size of the blur view's frame. When scrolling behind a UIToolbar, for example, you can see the background views being blurred before they actually intersect the toolbar's frame.
I would try something like:

```objc
rect = CGRectMake(rect.origin.x - blurRadius,
                  rect.origin.y - blurRadius,
                  rect.size.width + blurRadius * 2.0,
                  rect.size.height + blurRadius * 2.0); // but not going out of the window
```
Do you see this as possible? :)
If you're curious what Apple's doing, at least from an input sense, you can take a look at CKBlurView's implementation. (WARNING! USES PRIVATE APIS! DO NOT SHIP APPS WITH THIS!)
It uses a private API called CAFilter, which has two interesting options: inputBounds and inputHardEdges. You could try fooling around with those using an instance of CKBlurView to see exactly what matches the bits you're seeing from UIToolbar, to get a better sense of what Apple's doing. Don't ship this or any code that uses CAFilter in anything; just play with it in some demo code that won't ever see the light of day in the App Store. You've been warned.
Is that possible? Sure. A custom bounds for the source layer looks like it would be a 2.0 feature, since there's already a custom center for the source layer planned (see #13).
Might as well make it inputFrame to handle both problems.
Thank you for sharing the link, I will definitely look into it. Yes, inputFrame would be perfect, and maybe the default value could include some extra space :)
2 * blur radius?
2 * blurRadius would be a perfect value in my opinion.