CIContext + createCGImage + UIImage crash

Date: 2016-10-30 18:49:08

Tags: ios xcode uiimage cgcontext ciimage

What I'm doing:

I take the CMSampleBuffer from AVFoundation's didOutputSampleBuffer callback, run a few filters on it, and output a UIImage each time the delegate hands me a buffer.

What is working:

All of the filters work fine and give me the output I want. Everything runs smoothly on newer phones (iPhone 6 / 6s / 7); on an iPhone 5s, however, the app freezes for a few seconds.

Filters & UIImage output:

let inputImage = self.bufferImage!

let filter = CIFilter(name: "CIPixellate")
let beginImage = inputImage
filter!.setValue(beginImage, forKey: kCIInputImageKey)

let filter3 = CIFilter(name: "CIColorMonochrome")
filter3!.setValue(filter!.outputImage, forKey: kCIInputImageKey)
filter3!.setValue(CIColor(red: 1, green:0, blue: 0), forKey: kCIInputColorKey)
filter3!.setValue(200.0, forKey: kCIInputIntensityKey)

let filter2 = CIFilter(name: "CIMultiplyBlendMode")
filter2!.setValue(filter3!.outputImage, forKey: kCIInputImageKey)
filter2!.setValue(inputImage, forKey: kCIInputBackgroundImageKey)
let output2 = filter2!.outputImage

let cgimg = self.context.createCGImage(output2!, fromRect: output2!.extent)
let newImage = UIImage(CGImage: cgimg!)
dispatch_sync(dispatch_get_main_queue()) {
     self.imageView?.image = newImage
}
self.context.clearCaches()

I create the CIContext as:

let context = CIContext(options: nil)

I have also tried forcing the CIContext to render in hardware (GPU), and vice versa (software).
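For reference, that choice is made through the context options when the CIContext is created; a minimal sketch in the same Swift 2.x syntax as the code above:

import CoreImage

// Force software (CPU) rendering; omit the option (or pass false)
// to let Core Image render on the GPU as usual.
let softwareContext = CIContext(options: [kCIContextUseSoftwareRenderer: true])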

My feeling is that something is running out of memory, or leaking, but when it freezes there is no error in Xcode; the app just sits there frozen. I added self.context.clearCaches() at the end, which didn't really change the original problem.

It only happens on the slower device, the 5s in this case; on the 6 / 6s / 7 it runs smoothly without any issues.

My full didOutputSampleBuffer, for reference:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    connection.videoOrientation = .Portrait
    let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)

    CVPixelBufferLockBaseAddress(imageBuffer!, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))

    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer!)

    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer!)

    let width = CVPixelBufferGetWidth(imageBuffer!)
    let height = CVPixelBufferGetHeight(imageBuffer!)

    let colorSpace = CGColorSpaceCreateDeviceRGB()

    let bitmap = CGBitmapInfo(rawValue: CGBitmapInfo.ByteOrder32Little.rawValue|CGImageAlphaInfo.PremultipliedFirst.rawValue)
    let context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                        bytesPerRow, colorSpace, bitmap.rawValue)

    let quartzImage = CGBitmapContextCreateImage(context!)

    CVPixelBufferUnlockBaseAddress(imageBuffer!,CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))

    self.bufferImage = CIImage(CGImage: quartzImage!)

    let inputImage = self.bufferImage!

    let filter = CIFilter(name: "CIPixellate")
    let beginImage = inputImage
    filter!.setValue(beginImage, forKey: kCIInputImageKey)

    let filter3 = CIFilter(name: "CIColorMonochrome")
    filter3!.setValue(filter!.outputImage, forKey: kCIInputImageKey)
    filter3!.setValue(CIColor(red: 1, green:0, blue: 0), forKey: kCIInputColorKey)
    filter3!.setValue(200.0, forKey: kCIInputIntensityKey)

    let filter2 = CIFilter(name: "CIMultiplyBlendMode")
    filter2!.setValue(filter3!.outputImage, forKey: kCIInputImageKey)
    filter2!.setValue(inputImage, forKey: kCIInputBackgroundImageKey)
    let output2 = filter2!.outputImage

    let cgimg = self.context.createCGImage(output2!, fromRect: output2!.extent)
    let newImage = UIImage(CGImage: cgimg!)

    dispatch_async(dispatch_get_main_queue()) {
        self.imageView?.image = newImage
    }
    self.context.clearCaches()
}

Update

I was able to fix the freezing by changing how the pixel buffer is turned into a CIImage:

let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
self.bufferImage = CIImage(CVPixelBuffer: pixelBuffer)

This takes the place of most of the code at the start of didOutputSampleBuffer.
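With that change, the start of the method looks roughly like this (a sketch; the filter chain and the rest of the method stay the same):

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    connection.videoOrientation = .Portrait

    // Wrap the pixel buffer directly; no CGBitmapContext, no base-address
    // locking, and no intermediate CGImage are needed.
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    self.bufferImage = CIImage(CVPixelBuffer: pixelBuffer)

    // ... filter chain and UIImage output continue as before ...
}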

However, CPU usage is now extremely high, and Xcode shows the 'Energy Impact' as high too!

1 Answer:

Answer 0 (score: 1)

You are saying:

dispatch_sync(dispatch_get_main_queue()) {
     self.imageView?.image = newImage
}

There is no reason to wait for the result of that call. Use dispatch_async instead.

(Even better: find out whether you are already on the main thread. If you are, don't dispatch anything at all.)
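A minimal sketch of that suggestion, in the same Swift 2.x syntax as the question's code (the helper name is just for illustration):

// Update the image view, dispatching to the main queue only when necessary.
func showFiltered(newImage: UIImage) {
    if NSThread.isMainThread() {
        self.imageView?.image = newImage
    } else {
        dispatch_async(dispatch_get_main_queue()) {
            self.imageView?.image = newImage
        }
    }
}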