Is there a faster way to create a CVPixelBuffer from a UIImage in Swift?

Asked: 2018-01-31 21:35:47

Tags: swift video uiimage buffer cvpixelbuffer

Task: record video in real time while applying a filter

Problem: getting a CVPixelBuffer from the modified UIImage is too slow

My camera's output is being filtered and fed straight into a UIImageView, so the user can see the effect in real time even when not recording video or taking a photo. I'd like to somehow record this constantly changing UIImage to video without the roundtrip I'm doing now. Currently I record by appending a CVPixelBuffer to an assetWriter, but since I apply the filter to a UIImage, I have to convert the UIImage back into a buffer. I tested with and without the UIImage -> buffer step, so I've confirmed it is what causes the unacceptable slowdown.

Below is the code in captureOutput, with comments to make clear what's happening, along with the method that gets a buffer from the UIImage:

    // this function is called to output the device's camera output in realtime
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

        if captureOutput {

            // create a CIImage from the sample buffer, bailing out instead of force-unwrapping
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)

            // wrap the CIImage in a UIImage
            image = UIImage(ciImage: cameraImage)


            if let ciImage = image?.ciImage {

                // apply filter to CIImage
                image = filterCIImage(with:ciImage)

                // make CGImage and apply orientation
                image = UIImage(cgImage: (image?.cgImage)!, scale: 1.0, orientation: UIImageOrientation.right)

                // get format description, dimensions and current sample time
                let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer)!
                self.currentVideoDimensions = CMVideoFormatDescriptionGetDimensions(formatDescription)
                self.currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(sampleBuffer)

                // check if user toggled video recording 
                // and asset writer is ready
                if(videoIsRecording && self.assetWriterPixelBufferInput?.assetWriterInput.isReadyForMoreMediaData == true){
                    // get pixel buffer from UIImage - SLOW!
                    let filteredBuffer = buffer(from: image!)

                    // append the buffer to the asset writer
                    let success = self.assetWriterPixelBufferInput?.append(filteredBuffer!, withPresentationTime: self.currentSampleTime!)

                    if success == false {
                        print("Pixel Buffer failed")
                    }

                }

            }


            DispatchQueue.main.async {
                // update UIImageView with filtered camera output
                imageView!.image = image
            }

        }

    }
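One way to avoid the slow UIImage -> CVPixelBuffer roundtrip for recording is to skip UIImage entirely on that path: keep the filtered CIImage, pull a recycled buffer from the adaptor's pixelBufferPool, and render into it with CIContext.render(_:to:). A minimal sketch, assuming an adaptor configured like self.assetWriterPixelBufferInput above; the function and parameter names here are illustrative, not from the original code:

```swift
import AVFoundation
import CoreImage

// Reuse one CIContext for the whole session - creating one per frame is expensive.
// e.g. let ciContext = CIContext() as a property on the capture class.

func append(filtered ciImage: CIImage,
            at time: CMTime,
            to adaptor: AVAssetWriterInputPixelBufferAdaptor,
            using ciContext: CIContext) {
    guard adaptor.assetWriterInput.isReadyForMoreMediaData,
          let pool = adaptor.pixelBufferPool else { return }

    // Grab a recycled buffer from the adaptor's pool instead of
    // allocating a fresh CVPixelBuffer every frame.
    var outputBuffer: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &outputBuffer)
    guard let buffer = outputBuffer else { return }

    // Render the filtered CIImage directly into the pixel buffer
    // (no UIImage wrapper, no CGContext redraw of the frame).
    ciContext.render(ciImage, to: buffer)

    if adaptor.append(buffer, withPresentationTime: time) == false {
        print("Pixel buffer append failed")
    }
}
```

Note that adaptor.pixelBufferPool is only non-nil after the asset writer's startWriting() and startSession(atSourceTime:) have been called, and the filter orientation would need to be applied on the CIImage (e.g. oriented(forExifOrientation:)) rather than via the UIImage initializer.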


    // UIImage to buffer method:
    func buffer(from image: UIImage) -> CVPixelBuffer? {
        let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue, kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary
        var pixelBuffer : CVPixelBuffer?
        let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(image.size.width), Int(image.size.height), kCVPixelFormatType_32ARGB, attrs, &pixelBuffer)
        guard (status == kCVReturnSuccess) else {
            return nil
        }

        CVPixelBufferLockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
        let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer!)

        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        let context = CGContext(data: pixelData, width: Int(image.size.width), height: Int(image.size.height), bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer!), space: rgbColorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)

        context?.translateBy(x: 0, y: image.size.height)
        context?.scaleBy(x: 1.0, y: -1.0)

        UIGraphicsPushContext(context!)
        image.draw(in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
        UIGraphicsPopContext()
        CVPixelBufferUnlockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))

        return pixelBuffer
    }
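If the UIImage path has to stay, the per-frame CVPixelBufferCreate plus CGContext redraw above is the main cost. A pool-backed variant of the same method (a sketch; the pool is assumed to be created once per recording session with fixed dimensions) at least removes the per-frame allocation:

```swift
import CoreVideo
import UIKit

// Created once, e.g. when recording starts.
func makePixelBufferPool(width: Int, height: Int) -> CVPixelBufferPool? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32ARGB,
        kCVPixelBufferWidthKey: width,
        kCVPixelBufferHeightKey: height,
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var pool: CVPixelBufferPool?
    CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, attrs as CFDictionary, &pool)
    return pool
}

// Per-frame: recycle a buffer from the pool instead of calling CVPixelBufferCreate.
func buffer(from image: UIImage, pool: CVPixelBufferPool) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    guard CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                  width: Int(image.size.width),
                                  height: Int(image.size.height),
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    else { return nil }

    // Flip to match UIKit's top-left origin, then draw (still CPU-bound,
    // which is why the CIContext.render route is usually preferable).
    context.translateBy(x: 0, y: image.size.height)
    context.scaleBy(x: 1.0, y: -1.0)
    UIGraphicsPushContext(context)
    image.draw(in: CGRect(origin: .zero, size: image.size))
    UIGraphicsPopContext()
    return buffer
}
```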

0 Answers:

No answers yet.