Rendering an animated CALayer off-screen using CARenderer with MTLTexture

Date: 2019-05-15 13:17:12

标签: rendering metal core-image quartz-core off-screen

I want to render an animated NSView (or just its underlying CALayer) into a series of images without the view being displayed on screen at all. I have figured out how to do it with CARenderer and MTLTexture, but the approach below has a couple of problems.

It runs in a playground and stores the output into an Off-screen Render folder inside Downloads:

import AppKit
import Metal
import QuartzCore
import PlaygroundSupport

// A layer-backed 600×400 container with a red 50×50 circle that will be animated across it.
let view = NSView(frame: CGRect(x: 0, y: 0, width: 600, height: 400))
let circle = NSView(frame: CGRect(x: 0, y: 0, width: 50, height: 50))

circle.wantsLayer = true
circle.layer?.backgroundColor = NSColor.red.cgColor
circle.layer?.cornerRadius = 25
view.wantsLayer = true
view.addSubview(circle)

// The texture CARenderer will draw into; it must be usable as a render target.
let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm, width: 600, height: 400, mipmapped: false)
textureDescriptor.usage = [MTLTextureUsage.shaderRead, .shaderWrite, .renderTarget]

let device = MTLCreateSystemDefaultDevice()!
let texture: MTLTexture = device.makeTexture(descriptor: textureDescriptor)!
let context = CIContext(mtlDevice: device)
let renderer = CARenderer(mtlTexture: texture)

renderer.layer = view.layer
renderer.bounds = view.frame

let outputURL: URL = try! FileManager.default.url(for: .downloadsDirectory, in: .userDomainMask, appropriateFor: nil, create: false).appendingPathComponent("Off-screen Render")
try? FileManager.default.removeItem(at: outputURL)
try! FileManager.default.createDirectory(at: outputURL, withIntermediateDirectories: true, attributes: nil)

var frameNumber: Int = 0

func render() {
    Swift.print("Rendering frame #\(frameNumber)…")

    // Render the layer tree for the current media time into the texture.
    renderer.beginFrame(atTime: CACurrentMediaTime(), timeStamp: nil)
    renderer.addUpdate(renderer.bounds)
    renderer.render()
    renderer.endFrame()

    // Read the texture back through Core Image and write the frame to disk as a PNG.
    let ciImage: CIImage = CIImage(mtlTexture: texture)!
    let cgImage: CGImage = context.createCGImage(ciImage, from: ciImage.extent)!
    let url: URL = outputURL.appendingPathComponent("frame-\(frameNumber).png")
    let destination: CGImageDestination = CGImageDestinationCreateWithURL(url as CFURL, kUTTypePNG, 1, nil)!
    CGImageDestinationAddImage(destination, cgImage, nil)
    guard CGImageDestinationFinalize(destination) else { fatalError() }

    frameNumber += 1
}

var timer: Timer?

// Animate the circle into the opposite corner, rendering frames while it moves.
NSAnimationContext.runAnimationGroup({ context in
    context.duration = 0.25
    circle.animator().frame.origin = CGPoint(x: 550, y: 350)
}, completionHandler: {
    timer?.invalidate()
    render()
    Swift.print("Finished off-screen rendering of \(frameNumber) frames in \(outputURL.path)…")
})

// Render the first frame immediately after the animation starts; the completion handler
// above renders the last one. For the purpose of this demo a timer is used instead of
// a display link (a sketch of the latter follows below).

render()
timer = Timer.scheduledTimer(withTimeInterval: 1 / 30, repeats: true, block: { _ in render() })
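
For reference, a minimal untested sketch of the display-link variant mentioned in the comment above; note that the CVDisplayLink output handler fires on a background thread, hence the hop to the main queue:

import CoreVideo

var displayLink: CVDisplayLink?
CVDisplayLinkCreateWithActiveCGDisplays(&displayLink)
CVDisplayLinkSetOutputHandler(displayLink!) { _, _, _, _, _ in
    // The handler runs off the main thread; dispatch the render to it.
    DispatchQueue.main.async { render() }
    return kCVReturnSuccess
}
CVDisplayLinkStart(displayLink!)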

The problems with the above code are shown in the attachment below, and are:

  1. The texture doesn't get cleared, and every next frame is drawn on top of the previous render. I'm aware that I could use replace(region:…), but suspect that it's not efficient compared to a render pass with a clear-color description. Is this true? Can a render pass be used with CARenderer? (A sketch of such a pass appears below the attachment.)

  2. The first frame (in a real project the first two or three frames) often comes out empty. I suspect this has to do with some asynchronous behaviour during CARenderer's rendering or during construction of the CGImage with Core Image. How can this be avoided? Is there some kind of wait-until-rendered callback on the texture?

(Screenshot attachment illustrating both problems.)
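
Regarding problem 1, here is a minimal, untested sketch of what clearing the texture with an empty Metal render pass could look like; it reuses device and texture from above, and whether this plays well with CARenderer's own command encoding is exactly what is being asked:

let commandQueue = device.makeCommandQueue()!

func clearTexture() {
    // A render pass with no draw calls still performs its load action,
    // filling the color attachment with the clear color.
    let passDescriptor = MTLRenderPassDescriptor()
    passDescriptor.colorAttachments[0].texture = texture
    passDescriptor.colorAttachments[0].loadAction = .clear
    passDescriptor.colorAttachments[0].storeAction = .store
    passDescriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)

    let commandBuffer = commandQueue.makeCommandBuffer()!
    let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: passDescriptor)!
    encoder.endEncoding()
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
}

Calling clearTexture() at the top of render() would wipe the previous frame before CARenderer draws the next one.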

2 Answers:

Answer 0 (score: 2):

After talking to the Apple Developer Technical Support, it appears that:

"Core Image defers the rendering until the client requests the access to the frame buffer, i.e. CVPixelBufferLockBaseAddress."

So, the solution is simply to call CVPixelBufferLockBaseAddress after CIContext.render, as shown below:

// `pixelBufferAdaptor`, `input`, `frameImage`, `frameCount` and `frameRate` come
// from the surrounding AVAssetWriter setup (see the sketch below); `context` is a CIContext.
for frameNumber in 0 ..< frameCount {
    var pixelBuffer: CVPixelBuffer?
    guard let pixelBufferPool: CVPixelBufferPool = pixelBufferAdaptor.pixelBufferPool else { preconditionFailure() }
    precondition(CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &pixelBuffer) == kCVReturnSuccess)

    let ciImage = CIImage(cgImage: frameImage)
    context.render(ciImage, to: pixelBuffer!)

    // Locking the base address is what forces Core Image to actually render.
    precondition(CVPixelBufferLockBaseAddress(pixelBuffer!, []) == kCVReturnSuccess)
    defer { precondition(CVPixelBufferUnlockBaseAddress(pixelBuffer!, []) == kCVReturnSuccess) }

    // Sanity check: the buffer must contain non-zero bytes after rendering.
    let bytes = UnsafeBufferPointer(start: CVPixelBufferGetBaseAddress(pixelBuffer!)!.assumingMemoryBound(to: UInt8.self), count: CVPixelBufferGetDataSize(pixelBuffer!))
    precondition(bytes.contains(where: { $0 != 0 }))

    while !input.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 10 / 1000) }
    precondition(pixelBufferAdaptor.append(pixelBuffer!, withPresentationTime: CMTime(seconds: Double(frameNumber) * frameRate, preferredTimescale: 600)))
}
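
For context, the loop above assumes an AVAssetWriter pipeline roughly like the following sketch; movieURL, the dimensions, and the codec settings are assumptions chosen to match the snippet, not part of the original answer:

import AVFoundation

let movieURL = URL(fileURLWithPath: NSTemporaryDirectory() + "off-screen-render.mov") // assumed output location
let writer = try! AVAssetWriter(outputURL: movieURL, fileType: .mov)
let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 600,
    AVVideoHeightKey: 400,
])
let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: input,
    sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB])
writer.add(input)
writer.startWriting()
writer.startSession(atSourceTime: .zero)

// After the frame loop completes:
// input.markAsFinished()
// writer.finishWriting { /* check writer.status */ }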

P.S. This is the same answer as for the Making CIContext.render(CIImage, CVPixelBuffer) work with AVAssetWriter question; you may want to check it out for a better understanding of where and how this issue can occur when working with AVFoundation. Even though the question is different, the solution is exactly the same.

Answer 1 (score: 1):

I think you can use AVVideoCompositionCoreAnimationTool to render the view with its animations; a sketch follows below.
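
A minimal, untested sketch of that approach; videoURL and exportURL are assumptions, and any CAAnimation added to the layer tree needs beginTime = AVCoreAnimationBeginTimeAtZero and isRemovedOnCompletion = false to survive the export:

import AVFoundation
import QuartzCore

let asset = AVAsset(url: videoURL) // the base video track; videoURL is an assumption
let composition = AVMutableVideoComposition(propertiesOf: asset)

// The tool composites videoLayer (where the video frames land) and any
// sibling layers inside parentLayer into the final frames.
let parentLayer = CALayer()
let videoLayer = CALayer()
let animationLayer = CALayer() // the layer tree you want to animate
parentLayer.frame = CGRect(origin: .zero, size: composition.renderSize)
videoLayer.frame = parentLayer.frame
animationLayer.frame = parentLayer.frame
parentLayer.addSublayer(videoLayer)
parentLayer.addSublayer(animationLayer)

composition.animationTool = AVVideoCompositionCoreAnimationTool(
    postProcessingAsVideoLayer: videoLayer, in: parentLayer)

let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)!
export.videoComposition = composition
export.outputURL = exportURL // assumption
export.outputFileType = .mov
export.exportAsynchronously { /* check export.status */ }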
