I am given a CVPixelBuffer, and I need to append it to an rtmpStream object from the lf.swift library in order to stream it to YouTube. The call looks like this: rtmpStream.appendSampleBuffer(sampleBuffer: CMSampleBuffer, withType: CMSampleBufferType)
So I need to somehow convert the CVPixelBuffer into a CMSampleBuffer that I can append to the rtmpStream. This is what I tried:
var sampleBuffer: CMSampleBuffer? = nil

// Timing info: only the presentation timestamp is set; duration and
// decode timestamp keep their "invalid" defaults.
var sampleTimingInfo: CMSampleTimingInfo = kCMTimingInfoInvalid
sampleTimingInfo.presentationTimeStamp = presentationTime

// Build a video format description from the pixel buffer.
var formatDesc: CMVideoFormatDescription? = nil
_ = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDesc)

if let formatDesc = formatDesc {
    // Wrap the pixel buffer and timing info into a ready-to-use sample buffer.
    CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault, pixelBuffer, formatDesc, &sampleTimingInfo, &sampleBuffer)
}

if let sampleBuffer = sampleBuffer {
    self.rtmpStream.appendSampleBuffer(sampleBuffer, withType: CMSampleBufferType.video)
}
But unfortunately this doesn't work. The streaming library itself has been tested and works fine when I stream camera input or a screen capture. I think the problem might be the sampleTimingInfo, since it expects a decodeTime and a duration, and I don't know how to obtain those for the provided CVPixelBuffer.
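For context, here is how I currently understand those timing fields could be populated. This is only a sketch under my own assumptions: the fixed 30 fps frame rate and the makeSampleBuffer helper are names I made up for illustration, not part of lf.swift, and I am assuming that for uncompressed frames the decode timestamp can simply stay invalid because no reordering is involved.

import CoreMedia
import CoreVideo

// Assumption: frames arrive at a fixed rate (30 fps here); adjust to your source.
let frameDuration = CMTimeMake(1, 30)

func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                      presentationTime: CMTime) -> CMSampleBuffer? {
    var timingInfo = CMSampleTimingInfo(
        duration: frameDuration,          // how long this frame should be displayed
        presentationTimeStamp: presentationTime,
        decodeTimeStamp: kCMTimeInvalid   // uncompressed frames need no decode timestamp
    )

    // Format description derived from the pixel buffer itself.
    var formatDesc: CMVideoFormatDescription? = nil
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDesc)
    guard let format = formatDesc else { return nil }

    // Wrap the pixel buffer, format description and timing into a sample buffer.
    var sampleBuffer: CMSampleBuffer? = nil
    CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault,
                                             pixelBuffer,
                                             format,
                                             &timingInfo,
                                             &sampleBuffer)
    return sampleBuffer
}

With this approach I would advance presentationTime by frameDuration for every frame I hand to the stream, but I am not sure whether that is what the library actually expects, which is why I am asking.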