I'm trying to convert a CMSampleBuffer from the camera output to a vImage and then apply some processing. Unfortunately, even without any further editing, the frames I get from the buffer have the wrong colors:
Implementation (memory management and error handling omitted):
Configuring the video output device:
videoDataOutput = AVCaptureVideoDataOutput()
videoDataOutput.videoSettings = [String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_32BGRA]
videoDataOutput.alwaysDiscardsLateVideoFrames = true
videoDataOutput.setSampleBufferDelegate(self, queue: captureQueue)
videoConnection = videoDataOutput.connection(withMediaType: AVMediaTypeVideo)
captureSession.sessionPreset = AVCaptureSessionPreset1280x720
let videoDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
guard let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice) else {
return
}
Creating a vImage from the CMSampleBuffer received from the camera:
// Convert `CMSampleBuffer` to `CVImageBuffer`
guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
var buffer: vImage_Buffer = vImage_Buffer()
buffer.data = CVPixelBufferGetBaseAddress(pixelBuffer)
buffer.rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)
buffer.width = vImagePixelCount(CVPixelBufferGetWidth(pixelBuffer))
buffer.height = vImagePixelCount(CVPixelBufferGetHeight(pixelBuffer))
let vformat = vImageCVImageFormat_CreateWithCVPixelBuffer(pixelBuffer)
let bitmapInfo:CGBitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)
var cgFormat = vImage_CGImageFormat(bitsPerComponent: 8,
bitsPerPixel: 32,
colorSpace: nil,
bitmapInfo: bitmapInfo,
version: 0,
decode: nil,
renderingIntent: .defaultIntent)
// Create vImage
vImageBuffer_InitWithCVPixelBuffer(&buffer, &cgFormat, pixelBuffer, vformat!.takeRetainedValue(), cgColor, vImage_Flags(kvImageNoFlags))
Converting the buffer to a UIImage:
For testing, the CVPixelBuffer is exported to a UIImage, but writing it back into the video buffer gives the same result.
var dstPixelBuffer: CVPixelBuffer?
let status = CVPixelBufferCreateWithBytes(nil, Int(buffer.width), Int(buffer.height),
kCVPixelFormatType_32BGRA, buffer.data,
Int(buffer.rowBytes), releaseCallback,
nil, nil, &dstPixelBuffer)
let destCGImage = vImageCreateCGImageFromBuffer(&buffer, &cgFormat, nil, nil, numericCast(kvImageNoFlags), nil)?.takeRetainedValue()
// create a UIImage
let exportedImage = destCGImage.flatMap { UIImage(cgImage: $0, scale: 0.0, orientation: UIImageOrientation.right) }
DispatchQueue.main.async {
self.previewView.image = exportedImage
}
Answer 0 (score: 1)
Try setting the color space on the CV image format:
let vformat = vImageCVImageFormat_CreateWithCVPixelBuffer(pixelBuffer).takeRetainedValue()
vImageCVImageFormat_SetColorSpace(vformat,
CGColorSpaceCreateDeviceRGB())
...and update your call to vImageBuffer_InitWithCVPixelBuffer to reflect the fact that vformat is now a managed reference:
let error = vImageBuffer_InitWithCVPixelBuffer(&buffer, &cgFormat, pixelBuffer, vformat, nil, vImage_Flags(kvImageNoFlags))
Finally, you can remove the following lines, since vImageBuffer_InitWithCVPixelBuffer does that work for you:
// buffer.data = CVPixelBufferGetBaseAddress(pixelBuffer)
// buffer.rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)
// buffer.width = vImagePixelCount(CVPixelBufferGetWidth(pixelBuffer))
// buffer.height = vImagePixelCount(CVPixelBufferGetHeight(pixelBuffer))
Note that you don't need to lock the Core Video pixel buffer: if you check the headerdoc, it says "It is not necessary to lock the CVPixelBuffer before calling this function".
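Putting this answer's suggestions together, the corrected conversion might look like the sketch below (error handling elided; `pixelBuffer` is assumed to be the CVImageBuffer pulled from the delegate's sample buffer, as in the question):

```swift
import Accelerate
import CoreVideo

func makeBuffer(from pixelBuffer: CVPixelBuffer) -> vImage_Buffer? {
    // Describe the CVPixelBuffer and attach an explicit color space,
    // so vImage knows how to interpret the BGRA data.
    let vformat = vImageCVImageFormat_CreateWithCVPixelBuffer(pixelBuffer).takeRetainedValue()
    vImageCVImageFormat_SetColorSpace(vformat, CGColorSpaceCreateDeviceRGB())

    // Desired Core Graphics format: alpha first, 32-bit little endian == BGRA8888.
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.first.rawValue |
                                            CGBitmapInfo.byteOrder32Little.rawValue)
    var cgFormat = vImage_CGImageFormat(bitsPerComponent: 8,
                                        bitsPerPixel: 32,
                                        colorSpace: nil,
                                        bitmapInfo: bitmapInfo,
                                        version: 0,
                                        decode: nil,
                                        renderingIntent: .defaultIntent)

    // vImageBuffer_InitWithCVPixelBuffer allocates and fills the buffer itself;
    // no need to set data/rowBytes/width/height manually, and no need to lock
    // the pixel buffer first.
    var buffer = vImage_Buffer()
    let error = vImageBuffer_InitWithCVPixelBuffer(&buffer, &cgFormat, pixelBuffer,
                                                   vformat, nil,
                                                   vImage_Flags(kvImageNoFlags))
    return error == kvImageNoError ? buffer : nil
}
```

The caller owns `buffer.data` afterwards and is responsible for freeing it.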
Answer 1 (score: 0)
The call to vImageBuffer_InitWithCVPixelBuffer is modifying the contents of both the vImage_Buffer and the CVPixelBuffer, which is a bit naughty, because in your (linked) code you promise not to modify the pixels when you say
CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
The correct way to initialize the CGBitmapInfo for BGRA8888 is alpha first, 32-bit little endian. This is not obvious, but it is covered in the headerdoc for vImage_CGImageFormat in vImage_Utilities.h:
let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.first.rawValue | CGImageByteOrderInfo.order32Little.rawValue)
What I don't get is why vImageBuffer_InitWithCVPixelBuffer is modifying your buffer, as cgFormat (the desiredFormat) should match vformat; however, it is documented to modify the buffer, so perhaps you should copy the data first.