All frames black when trying to write Metal frames to a QuickTime file using AVFoundation AVAssetWriter

Date: 2018-05-31 22:56:02

Tags: swift avfoundation metal

I'm using this Swift class (originally shown in the answer to this question: Capture Metal MTKView as Movie in realtime?) to try to record my Metal app's frames to a movie file.

class MetalVideoRecorder {
    var isRecording = false
    var recordingStartTime = TimeInterval(0)

    private var assetWriter: AVAssetWriter
    private var assetWriterVideoInput: AVAssetWriterInput
    private var assetWriterPixelBufferInput: AVAssetWriterInputPixelBufferAdaptor

    init?(outputURL url: URL, size: CGSize) {
        do {
            assetWriter = try AVAssetWriter(outputURL: url, fileType: AVFileTypeAppleM4V)
        } catch {
            return nil
        }

        let outputSettings: [String: Any] = [ AVVideoCodecKey : AVVideoCodecH264,
            AVVideoWidthKey : size.width,
            AVVideoHeightKey : size.height ]

        assetWriterVideoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)
        assetWriterVideoInput.expectsMediaDataInRealTime = true

        let sourcePixelBufferAttributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String : kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey as String : size.width,
            kCVPixelBufferHeightKey as String : size.height ]

        assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: assetWriterVideoInput,
                                                                           sourcePixelBufferAttributes: sourcePixelBufferAttributes)

        assetWriter.add(assetWriterVideoInput)
    }

    func startRecording() {
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: kCMTimeZero)

        recordingStartTime = CACurrentMediaTime()
        isRecording = true
    }

    func endRecording(_ completionHandler: @escaping () -> ()) {
        isRecording = false

        assetWriterVideoInput.markAsFinished()
        assetWriter.finishWriting(completionHandler: completionHandler)
    }

    func writeFrame(forTexture texture: MTLTexture) {
        if !isRecording {
            return
        }

        // Busy-wait until the writer input is ready to accept another sample
        while !assetWriterVideoInput.isReadyForMoreMediaData {}

        guard let pixelBufferPool = assetWriterPixelBufferInput.pixelBufferPool else {
            print("Pixel buffer asset writer input did not have a pixel buffer pool available; cannot retrieve frame")
            return
        }

        var maybePixelBuffer: CVPixelBuffer? = nil
        let status = CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &maybePixelBuffer)
        if status != kCVReturnSuccess {
            print("Could not get pixel buffer from asset writer input; dropping frame...")
            return
        }

        guard let pixelBuffer = maybePixelBuffer else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!

        // Use the bytes per row value from the pixel buffer since its stride may be rounded up to be 16-byte aligned
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let region = MTLRegionMake2D(0, 0, texture.width, texture.height)

        texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

        let frameTime = CACurrentMediaTime() - recordingStartTime
        let presentationTime = CMTimeMakeWithSeconds(frameTime, 240)
        assetWriterPixelBufferInput.append(pixelBuffer, withPresentationTime: presentationTime)

        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
    }
}
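A side note on the `bytesPerRow` comment in `writeFrame` above: Core Video may pad each row of a pixel buffer up to an alignment boundary, which is why the code asks the buffer for its stride instead of computing `width * 4`. Here is a minimal sketch of the round-up arithmetic (a hypothetical helper, not a CoreVideo API):

```swift
// Toy illustration of row padding: each row is rounded up to a multiple of
// `alignment` bytes (Core Video commonly uses 16- or 64-byte alignment).
func alignedBytesPerRow(width: Int, bytesPerPixel: Int, alignment: Int) -> Int {
    let raw = width * bytesPerPixel
    return (raw + alignment - 1) / alignment * alignment  // round up
}

print(alignedBytesPerRow(width: 100, bytesPerPixel: 4, alignment: 16))  // 400
print(alignedBytesPerRow(width: 101, bytesPerPixel: 4, alignment: 16))  // 416
```

If you copy texture rows assuming a stride of `width * 4` into a buffer whose real stride is larger, each row lands at the wrong offset and the image shears or corrupts, so always use `CVPixelBufferGetBytesPerRow`.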

I'm not seeing any errors, but the frames in the resulting QuickTime file are all black. The frames are the correct size, and my pixel format is correct (bgra8Unorm). Does anyone have an idea why it might not be working?

I call the writeFrame function before presenting and committing the current drawable, like this:

        if let drawable = view.currentDrawable {

            if BigVideoWriter != nil && BigVideoWriter!.isRecording {
                commandBuffer.addCompletedHandler { commandBuffer in
                    BigVideoWriter?.writeFrame(forTexture: drawable.texture)
                }
            }

            commandBuffer.present(drawable)
            commandBuffer.commit()      
        }

I originally got an error that my MetalKitView layer was 'framebufferOnly'. So I set it to false before trying to record. That got rid of the error, but the frames were all black. I also tried setting it to false at the very beginning of the program, but I got the same result.

I also tried using 'addCompletedHandler' instead of 'addScheduledHandler', but that gave me the error "[CAMetalLayerDrawable texture] should not be called after already presenting this drawable. Get a nextDrawable instead."

Thanks for any suggestions!

EDIT: I figured this out with help from @Idogy. Testing showed that the original version worked on iOS but not on Mac. He said that because I have an NVIDIA GPU, the framebuffer is private. So I had to add a blitCommandEncoder with a synchronize call on the texture, and then it started working. Like this:

   if let drawable = view.currentDrawable {

        if BigVideoWriter != nil && BigVideoWriter!.isRecording {
 #if ISMAC
            if let blitCommandEncoder = commandBuffer.makeBlitCommandEncoder() {
                blitCommandEncoder.synchronize(resource: drawable.texture)
                blitCommandEncoder.endEncoding()
            }
 #endif
            commandBuffer.addCompletedHandler { commandBuffer in
                BigVideoWriter?.writeFrame(forTexture: drawable.texture)
            }
        }

        commandBuffer.present(drawable)
        commandBuffer.commit()    
    }

1 Answer:

Answer 0 (score: 2):

I believe you are writing your frames too early: by calling writeFrame within your render loop, you are capturing the drawable while it is still empty (the GPU simply hasn't rendered it yet).

Remember that before you call commandBuffer.commit(), the GPU hasn't even started rendering your frame. You need to wait for the GPU to finish rendering before trying to grab the resulting frame. The sequence is a bit confusing because you also call present() before calling commit(), but that isn't the actual order of operations at runtime. That present call merely tells Metal to schedule a call to present your frame to the screen once the GPU has finished rendering.
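The ordering described above can be sketched with a toy model in plain Swift (no Metal; ToyCommandBuffer is a hypothetical stand-in for MTLCommandBuffer's behavior, not a real API):

```swift
// present() and addCompletedHandler() only *schedule* work; the GPU does
// nothing until commit(), and completed handlers fire after the GPU finishes,
// i.e. after both present() and commit() have already returned.
final class ToyCommandBuffer {
    private var completedHandlers: [() -> Void] = []
    var log: [String] = []

    func addCompletedHandler(_ handler: @escaping () -> Void) {
        completedHandlers.append(handler)    // just schedules it
    }
    func present() { log.append("present scheduled") }
    func commit() {
        log.append("committed")
        log.append("GPU rendering")          // pretend the GPU runs now
        completedHandlers.forEach { $0() }   // completion fires last
    }
}

let cb = ToyCommandBuffer()
cb.addCompletedHandler { cb.log.append("writeFrame") }
cb.present()
cb.commit()
print(cb.log)  // ["present scheduled", "committed", "GPU rendering", "writeFrame"]
```

The point is that "writeFrame" can only appear after "GPU rendering" in the log, which is exactly where frame capture must happen.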

You should call writeFrame from a completion handler (using commandBuffer.addCompletedHandler()). That should take care of this.

UPDATE: While the answer above is correct, it is only partial. Since the OP is using a discrete GPU with private VRAM, the CPU cannot see the render target's pixels. The solution to that problem is to add an MTLBlitCommandEncoder and use its synchronize() method to ensure that the rendered pixels are copied from the GPU's VRAM back to RAM.
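A toy plain-Swift model of that failure mode (no Metal; all names are hypothetical): with private VRAM, the CPU-visible copy of the texture stays empty, hence the black frames, until an explicit synchronize copies the pixels back.

```swift
// The CPU-side mirror of a texture only holds valid data after an explicit
// synchronize, which stands in for MTLBlitCommandEncoder.synchronize(resource:).
final class ToyTexture {
    var gpuPixels: [UInt8] = []   // what the GPU rendered (private VRAM)
    var cpuMirror: [UInt8] = []   // what getBytes() would see (system RAM)

    func renderOnGPU(_ pixels: [UInt8]) { gpuPixels = pixels }
    func synchronize() { cpuMirror = gpuPixels }   // blit VRAM -> RAM
}

let tex = ToyTexture()
tex.renderOnGPU([255, 0, 0, 255])
print(tex.cpuMirror)   // [] -- CPU never saw the pixels: a "black" frame
tex.synchronize()
print(tex.cpuMirror)   // [255, 0, 0, 255]
```

On integrated GPUs (and on iOS) memory is unified, which is why the original code worked there without the blit pass.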