I'm writing an application that needs to apply a filter to video captured with AVCaptureSession. The filtered output is written to an output file. I'm currently using CIFilter and CIImage to filter each video frame. Here is the code:
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    ...
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    let options = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
    let cameraImage = CIImage(cvImageBuffer: pixelBuffer, options: options)

    let filter = CIFilter(name: "CIGaussianBlur")!
    filter.setValue(70.0, forKey: kCIInputRadiusKey)
    filter.setValue(cameraImage, forKey: kCIInputImageKey)
    let result = filter.outputImage!

    var pixBuffer: CVPixelBuffer? = nil
    let fmt = CVPixelBufferGetPixelFormatType(pixelBuffer)
    CVPixelBufferCreate(kCFAllocatorSystemDefault,
                        CVPixelBufferGetWidth(pixelBuffer),
                        CVPixelBufferGetHeight(pixelBuffer),
                        fmt,
                        CVBufferGetAttachments(pixelBuffer, .shouldPropagate),
                        &pixBuffer)
    CVBufferPropagateAttachments(pixelBuffer, pixBuffer!)

    let eaglContext = EAGLContext(api: EAGLRenderingAPI.openGLES3)!
    eaglContext.isMultiThreaded = true
    let contextOptions = [kCIContextWorkingColorSpace: NSNull(), kCIContextOutputColorSpace: NSNull()]
    let context = CIContext(eaglContext: eaglContext, options: contextOptions)

    CVPixelBufferLockBaseAddress(pixBuffer!, CVPixelBufferLockFlags(rawValue: 0))
    context.render(result, to: pixBuffer!)
    CVPixelBufferUnlockBaseAddress(pixBuffer!, CVPixelBufferLockFlags(rawValue: 0))

    var timeInfo = CMSampleTimingInfo(duration: sampleBuffer.duration,
                                      presentationTimeStamp: sampleBuffer.presentationTimeStamp,
                                      decodeTimeStamp: sampleBuffer.decodeTimeStamp)
    var sampleBuf: CMSampleBuffer? = nil
    CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault,
                                             pixBuffer!,
                                             sampleBuffer.formatDescription!,
                                             &timeInfo,
                                             &sampleBuf)

    // write to video file
    let ret = assetWriterInput.append(sampleBuf!)
    ...
}
The ret returned by AVAssetWriterInput.append is always false. What am I doing wrong here? Also, the approach I'm using is very inefficient: several temporary copies are created along the way. Is it possible to do this in place?
Answer 0 (score: 0)
I used almost the same code and ran into the same problem. As it turned out, the pixel buffer created for rendering was the culprit. append(sampleBuffer:) always returned false, and assetWriter.error was:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSUnderlyingError=0x17024ba30 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}, NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed}
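(Side note: append returning false says little on its own; the writer's status and error carry the details. A minimal sketch for surfacing them, assuming the assetWriter and assetWriterInput names from the setup code, which isn't shown in the question:)

// Hedged diagnostic around the failing append; `assetWriter` and
// `assetWriterInput` are assumed to come from the writer setup code.
if !assetWriterInput.append(sampleBuf!) {
    if assetWriter.status == .failed {
        // For the buffers created above, this prints the -11800/-12780 pair.
        print("writer error: \(String(describing: assetWriter.error))")
    }
}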
This is said to be a bug (as discussed here), and a report has been filed: https://bugreport.apple.com/web/?problemID=34574848.
But then I stumbled on the fact that the problem disappears when you render into the original pixel buffer. See the code below:
let sourcePixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
let sourceImage = CIImage(cvImageBuffer: sourcePixelBuffer)
let filter = CIFilter(name: "CIGaussianBlur", withInputParameters: [kCIInputImageKey: sourceImage])!
let filteredImage = filter.outputImage!

var pixelBuffer: CVPixelBuffer? = nil
let width = CVPixelBufferGetWidth(sourcePixelBuffer)
let height = CVPixelBufferGetHeight(sourcePixelBuffer)
let pixelFormat = CVPixelBufferGetPixelFormatType(sourcePixelBuffer)
let attributes = CVBufferGetAttachments(sourcePixelBuffer, .shouldPropagate)!
CVPixelBufferCreate(nil, width, height, pixelFormat, attributes, &pixelBuffer)
CVBufferPropagateAttachments(sourcePixelBuffer, pixelBuffer!)

var filteredPixelBuffer = pixelBuffer!   // rendering into this fresh buffer never works
filteredPixelBuffer = sourcePixelBuffer  // ...but into the source buffer it does. 0_0

let context = CIContext(options: [kCIContextOutputColorSpace: CGColorSpace(name: CGColorSpace.sRGB)!])
context.render(filteredImage, to: filteredPixelBuffer) // modifying original image buffer here!

let presentationTimestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
var timing = CMSampleTimingInfo(duration: kCMTimeInvalid, presentationTimeStamp: presentationTimestamp, decodeTimeStamp: kCMTimeInvalid)

var processedSampleBuffer: CMSampleBuffer? = nil
var formatDescription: CMFormatDescription? = nil
CMVideoFormatDescriptionCreateForImageBuffer(nil, filteredPixelBuffer, &formatDescription)
CMSampleBufferCreateReadyWithImageBuffer(nil, filteredPixelBuffer, formatDescription!, &timing, &processedSampleBuffer)

print(assetInput!.append(processedSampleBuffer!))
Of course, we all know you are not supposed to modify a sample buffer, yet somehow this approach produces correctly processed video. The trick is dirty, and I can't say whether it holds up if you have a preview layer or some concurrent processing handlers.
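A cleaner alternative, if you'd rather not touch the source buffer: the -12780 failure is commonly tied to rendering into a pixel buffer that has no IOSurface backing. Appending through an AVAssetWriterInputPixelBufferAdaptor and drawing into buffers from its pool avoids that, and it also addresses the in-place/efficiency part of the question, since the pool recycles buffers and no CMSampleBuffer has to be rebuilt by hand. Below is a sketch under those assumptions; writerInput, appendFiltered, and the 1920x1080 dimensions are hypothetical names and values, not from the code above, and the CIContext should be created once rather than per frame:

import AVFoundation
import CoreImage

// Adaptor created once, alongside the asset writer input (hypothetical setup).
let adaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: writerInput,
    sourcePixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey as String: 1920,   // match your capture dimensions
        kCVPixelBufferHeightKey as String: 1080,
        kCVPixelBufferIOSurfacePropertiesKey as String: [:]  // request IOSurface backing
    ])

// Per frame: render the filtered CIImage into a pool buffer and append it.
func appendFiltered(_ image: CIImage, at time: CMTime, using context: CIContext) -> Bool {
    guard let pool = adaptor.pixelBufferPool else { return false } // nil until writing has started
    var output: CVPixelBuffer? = nil
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &output)
    guard let buffer = output else { return false }
    context.render(image, to: buffer)
    return adaptor.append(buffer, withPresentationTime: time)
}

With the adaptor in place, the only change on the capture path is calling something like appendFiltered(filteredImage, at: presentationTimestamp, using: context) instead of creating a pixel buffer and sample buffer per frame.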