In the new iOS Music app, we can see an album cover behind a view that blurs it.

How is something like this accomplished? I've looked through the documentation, but found nothing there.


Current answer

If you want to add a dark blur view to a tableView, this will do it nicely:

tableView.backgroundColor = .clear
let blurEffect = UIBlurEffect(style: .dark)
let blurEffectView = UIVisualEffectView(effect: blurEffect)
blurEffectView.frame = tableView.bounds
blurEffectView.autoresizingMask = [.flexibleHeight, .flexibleWidth]


// Assign blurEffectView to tableView.backgroundView instead of adding it as a subview, so it doesn't cover the table view cells
tableView.backgroundView = blurEffectView
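
One detail the snippet above leaves out (my addition, not part of the original answer): table view cells are opaque by default, so you also need to clear them or the blur will only show around the cells. A minimal sketch, assuming a standard cellForRowAt with a hypothetical "Cell" reuse identifier:

func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
    // Clear the cell and its contentView so the blurred backgroundView stays visible behind them.
    cell.backgroundColor = .clear
    cell.contentView.backgroundColor = .clear
    return cell
}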

Other answers

Objective-C

UIVisualEffect *blurEffect;
blurEffect = [UIBlurEffect effectWithStyle:UIBlurEffectStyleDark];
UIVisualEffectView *visualEffectView;
visualEffectView = [[UIVisualEffectView alloc] initWithEffect:blurEffect];
visualEffectView.frame = self.accessImageView.bounds;
[self.accessImageView addSubview:visualEffectView];

Swift 3.0

let blurEffect = UIBlurEffect(style: UIBlurEffectStyle.dark)
let blurEffectView = UIVisualEffectView(effect: blurEffect)
blurEffectView.frame = view.bounds
blurEffectView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
view.addSubview(blurEffectView)

Source: https://stackoverflow.com/a/24083728/4020910
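
A refinement worth adding (my addition, not from the linked answer): users can turn on Reduce Transparency in the Accessibility settings, in which case blur effects render as a flat color, so it is common to check for that and fall back to a solid background. In Swift 3 terms this could look like:

if !UIAccessibilityIsReduceTransparencyEnabled() {
    view.backgroundColor = .clear
    let blurEffect = UIBlurEffect(style: .dark)
    let blurEffectView = UIVisualEffectView(effect: blurEffect)
    blurEffectView.frame = view.bounds
    blurEffectView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    view.addSubview(blurEffectView)
} else {
    // Blur would be ignored anyway; use a plain dark background instead.
    view.backgroundColor = .black
}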

Custom blur radius

You can try UIVisualEffectView with a custom setting, like this:

class BlurViewController: UIViewController {
    private let blurEffect = (NSClassFromString("_UICustomBlurEffect") as! UIBlurEffect.Type).init()

    override func viewDidLoad() {
        super.viewDidLoad()
        let blurView = UIVisualEffectView(frame: UIScreen.main.bounds)
        blurEffect.setValue(1, forKeyPath: "blurRadius")
        blurView.effect = blurEffect
        view.addSubview(blurView)
    }   
}

Output: the results for blurEffect.setValue(1, …) and blurEffect.setValue(2, …).
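
Since _UICustomBlurEffect is a private class and key-value coding against it can break between iOS releases, a public-API alternative (my own sketch, not part of the answer above) is to drive the effect with a paused UIViewPropertyAnimator and use fractionComplete as the blur intensity (iOS 10+):

import UIKit

class AdjustableBlurView: UIVisualEffectView {
    private var animator: UIViewPropertyAnimator?

    /// intensity in 0...1, where 1 is the full system blur.
    func setBlur(style: UIBlurEffect.Style, intensity: CGFloat) {
        animator?.stopAnimation(true)
        effect = nil
        animator = UIViewPropertyAnimator(duration: 1, curve: .linear) { [weak self] in
            self?.effect = UIBlurEffect(style: style)
        }
        // Freezing the animation part-way through leaves a reduced blur on screen.
        animator?.fractionComplete = max(0, min(1, intensity))
    }

    deinit {
        animator?.stopAnimation(true)
    }
}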

Core Image

Since the image in the screenshot is static, you can use CIGaussianBlur from Core Image (requires iOS 6). Here is a sample: https://github.com/evanwdavis/Fun-with-Masks/blob/master/Fun%20with%20Masks/EWDBlurExampleVC.m

Note that this is slower than the other options on this page.

#import <QuartzCore/QuartzCore.h>
#import <CoreImage/CoreImage.h> // CIContext, CIImage and CIFilter live in Core Image

- (UIImage*) blur:(UIImage*)theImage
{   
    // ***********If you need re-orienting (e.g. trying to blur a photo taken from the device camera front facing camera in portrait mode)
    // theImage = [self reOrientIfNeeded:theImage];

    // create our blurred image
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *inputImage = [CIImage imageWithCGImage:theImage.CGImage];

    // setting up Gaussian Blur (we could use one of many filters offered by Core Image)
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:15.0f] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];

    // CIGaussianBlur has a tendency to shrink the image a little, 
    // this ensures it matches up exactly to the bounds of our original image
    CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];

    UIImage *returnImage = [UIImage imageWithCGImage:cgImage];//create a UIImage for this function to "return" so that ARC can manage the memory of the blur... ARC can't manage CGImageRefs so we need to release it before this function "returns" and ends.
    CGImageRelease(cgImage);//release CGImageRef because ARC doesn't manage this on its own.

    return returnImage;

    // *************** if you need scaling
    // return [[self class] scaleIfNeeded:cgImage];
}

+(UIImage*) scaleIfNeeded:(CGImageRef)cgimg {
    bool isRetina = [[[UIDevice currentDevice] systemVersion] intValue] >= 4 && [[UIScreen mainScreen] scale] == 2.0;
    if (isRetina) {
        return [UIImage imageWithCGImage:cgimg scale:2.0 orientation:UIImageOrientationUp];
    } else {
        return [UIImage imageWithCGImage:cgimg];
    }
}

- (UIImage*) reOrientIfNeeded:(UIImage*)theImage{

    if (theImage.imageOrientation != UIImageOrientationUp) {

        CGAffineTransform reOrient = CGAffineTransformIdentity;
        switch (theImage.imageOrientation) {
            case UIImageOrientationDown:
            case UIImageOrientationDownMirrored:
                reOrient = CGAffineTransformTranslate(reOrient, theImage.size.width, theImage.size.height);
                reOrient = CGAffineTransformRotate(reOrient, M_PI);
                break;
            case UIImageOrientationLeft:
            case UIImageOrientationLeftMirrored:
                reOrient = CGAffineTransformTranslate(reOrient, theImage.size.width, 0);
                reOrient = CGAffineTransformRotate(reOrient, M_PI_2);
                break;
            case UIImageOrientationRight:
            case UIImageOrientationRightMirrored:
                reOrient = CGAffineTransformTranslate(reOrient, 0, theImage.size.height);
                reOrient = CGAffineTransformRotate(reOrient, -M_PI_2);
                break;
            case UIImageOrientationUp:
            case UIImageOrientationUpMirrored:
                break;
        }

        switch (theImage.imageOrientation) {
            case UIImageOrientationUpMirrored:
            case UIImageOrientationDownMirrored:
                reOrient = CGAffineTransformTranslate(reOrient, theImage.size.width, 0);
                reOrient = CGAffineTransformScale(reOrient, -1, 1);
                break;
            case UIImageOrientationLeftMirrored:
            case UIImageOrientationRightMirrored:
                reOrient = CGAffineTransformTranslate(reOrient, theImage.size.height, 0);
                reOrient = CGAffineTransformScale(reOrient, -1, 1);
                break;
            case UIImageOrientationUp:
            case UIImageOrientationDown:
            case UIImageOrientationLeft:
            case UIImageOrientationRight:
                break;
        }

        CGContextRef myContext = CGBitmapContextCreate(NULL, theImage.size.width, theImage.size.height, CGImageGetBitsPerComponent(theImage.CGImage), 0, CGImageGetColorSpace(theImage.CGImage), CGImageGetBitmapInfo(theImage.CGImage));

        CGContextConcatCTM(myContext, reOrient);

        switch (theImage.imageOrientation) {
            case UIImageOrientationLeft:
            case UIImageOrientationLeftMirrored:
            case UIImageOrientationRight:
            case UIImageOrientationRightMirrored:
                CGContextDrawImage(myContext, CGRectMake(0,0,theImage.size.height,theImage.size.width), theImage.CGImage);
                break;

            default:
                CGContextDrawImage(myContext, CGRectMake(0,0,theImage.size.width,theImage.size.height), theImage.CGImage);
                break;
        }

        CGImageRef CGImg = CGBitmapContextCreateImage(myContext);
        theImage = [UIImage imageWithCGImage:CGImg];

        CGImageRelease(CGImg);
        CGContextRelease(myContext);
    }

    return theImage;
}

Stack Blur (Box + Gaussian)

This implements a mix of box and Gaussian blur. It is 7x faster than a non-accelerated Gaussian, yet not as ugly as a box blur. See the demo here (Java plugin version) or here (JavaScript version). The algorithm is used in KDE and Camera+, among others. It doesn't use the Accelerate framework, but it's fast.
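
For context, and as my own illustration rather than the StackBlur algorithm itself: the reason a box/Gaussian hybrid is attractive is that box blurs are cheap and repeated box blurs converge toward a Gaussian. That trade-off can be sketched with Core Image's stock CIBoxBlur filter:

import UIKit
import CoreImage

// Sketch: chain three cheap box blurs to approximate a Gaussian-looking result.
func approximateGaussianBlur(_ image: UIImage, radius: Double) -> UIImage? {
    guard let cgImage = image.cgImage else { return nil }
    var working = CIImage(cgImage: cgImage)
    let extent = working.extent

    for _ in 0..<3 {
        working = working
            .applyingFilter("CIBoxBlur", parameters: [kCIInputRadiusKey: radius])
            .cropped(to: extent) // blurring grows the extent; crop back to the original
    }

    let context = CIContext()
    guard let output = context.createCGImage(working, from: extent) else { return nil }
    return UIImage(cgImage: output, scale: image.scale, orientation: image.imageOrientation)
}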

Accelerate.framework

In the session “Implementing Engaging UI on iOS” from WWDC 2013, Apple explains how to create a blurred background (at 14:30) and mentions a method applyLightEffect implemented in the sample code using Accelerate.framework.

GPUImage

GPUImage uses OpenGL shaders to create dynamic blurs. It has several types of blur: GPUImageBoxBlurFilter, GPUImageFastBlurFilter, GaussianSelectiveBlur, GPUImageGaussianBlurFilter. There is even a GPUImageiOSBlurFilter that “should fully replicate the blur effect provided by iOS 7's control panel” (tweet, article). The article is detailed and informative.

    -(UIImage *)blurryGPUImage:(UIImage *)image withBlurLevel:(NSInteger)blur {
        GPUImageFastBlurFilter *blurFilter = [GPUImageFastBlurFilter new];
        blurFilter.blurSize = blur;
        UIImage *result = [blurFilter imageByFilteringImage:image];
        return result;
    }

From indieambitions.com: perform a blur using vImage. The algorithm is also used in iOS-RealTimeBlur.

From Nick Lockwood: https://github.com/nicklockwood/FXBlurView The example shows the blur over a scroll view. It blurs with dispatch_async, then syncs to apply updates with UITrackingRunLoopMode so the blur doesn't lag when UIKit gives more priority to the scrolling of the UIScrollView. This is explained in Nick's book iOS Core Animation, which, by the way, is great.

iOS-blur: this takes the blurring layer of the UIToolbar and puts it elsewhere. Apple will reject your app if you use this method. See https://github.com/mochidev/MDBlurView/issues/4

From the Evadne blog: LiveFrost: Fast, Synchronous UIView Snapshot Convolving. Great code and a great read. Some ideas from that post: use a serial queue to throttle updates from CADisplayLink; reuse bitmap contexts unless the bounds change; draw smaller images using -[CALayer renderInContext:] with a 0.5f scale factor.
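
Here is a rough sketch of the "snapshot small, blur off the main thread" idea those projects describe (my own illustration, not code from FXBlurView or LiveFrost); the 0.5 scale factor and the background-queue blur mirror the points listed above:

import UIKit
import CoreImage

func blurredSnapshot(of view: UIView, radius: Double = 15, completion: @escaping (UIImage?) -> Void) {
    // 1. Snapshot on the main thread at half scale; less data means a faster blur.
    let format = UIGraphicsImageRendererFormat()
    format.scale = 0.5
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds, format: format)
    let snapshot = renderer.image { _ in
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
    }

    // 2. Blur on a background queue, then hop back to the main thread.
    DispatchQueue.global(qos: .userInitiated).async {
        guard let input = CIImage(image: snapshot),
              let filter = CIFilter(name: "CIGaussianBlur") else {
            DispatchQueue.main.async { completion(nil) }
            return
        }
        filter.setValue(input, forKey: kCIInputImageKey)
        filter.setValue(radius, forKey: kCIInputRadiusKey)

        let context = CIContext()
        var result: UIImage?
        if let output = filter.outputImage,
           let cgImage = context.createCGImage(output, from: input.extent) {
            result = UIImage(cgImage: cgImage)
        }
        DispatchQueue.main.async { completion(result) }
    }
}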

Other stuff

Andy Matuschak said on Twitter: “you know, in a lot of the places where it looks live, it's static with clever tricks.”

At doubleencore.com they say: “We found that a 10 pt blur radius plus a 10 pt increase in saturation best mimics iOS 7's blur effect in most circumstances.”
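
That recipe can be sketched with Core Image (my own sketch; the saturation value is a guess, since CIColorControls takes a multiplier rather than “points”):

import UIKit
import CoreImage

func iOS7StyleBlur(_ image: UIImage) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    // 10 pt blur radius plus a saturation boost, per the quote above.
    let processed = input
        .applyingFilter("CIGaussianBlur", parameters: [kCIInputRadiusKey: 10])
        .applyingFilter("CIColorControls", parameters: [kCIInputSaturationKey: 1.8])

    let context = CIContext()
    guard let cgImage = context.createCGImage(processed, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}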

Private headers of Apple's SBFProceduralWallpaperView.

Finally, this isn't a real blur, but remember that you can set the rasterizationScale to get a pixelated image: http://www.dimzzy.com/blog/2010/11/blur-effect-for-uiview/
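
For completeness, the rasterization trick from that link boils down to two CALayer properties (someView is a placeholder):

// Not a real blur: rasterizing the layer at a low scale produces a pixelated copy.
someView.layer.shouldRasterize = true
someView.layer.rasterizationScale = 0.25 // lower values give chunkier pixels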

I don't think I'm allowed to post the code, but the WWDC sample code mentioned above is correct. Here is the link: https://developer.apple.com/downloads/index.action?name=WWDC%202013

The file you are looking for is the category on UIImage, and the method is applyLightEffect.

As I mentioned in a comment above, the Apple blur involves saturation and other things besides blurring. A simple blur won't do... if you want to imitate their style.
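
Assuming you add the UIImage+ImageEffects category from that sample code to your project (it bridges straight into Swift), usage is a one-liner; albumArt is a placeholder image name:

// applyLightEffect() comes from Apple's WWDC sample category and handles both the blur and the saturation boost.
let albumArt = UIImage(named: "albumArt")
let blurredBackground = albumArt?.applyLightEffect()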