I'm trying to convert a sampleBuffer to a UIImage and display it in an image view using a gray color space. But it renders like the image below. I think there's a problem with the conversion. How do I convert a CMSampleBuffer?
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    print("buffered")
    let imageBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
    let width: Int = CVPixelBufferGetWidth(imageBuffer)
    let height: Int = CVPixelBufferGetHeight(imageBuffer)
    let bytesPerRow: Int = CVPixelBufferGetBytesPerRow(imageBuffer)
    let lumaBuffer = CVPixelBufferGetBaseAddress(imageBuffer)
    //let planeCount: Int = CVPixelBufferGetPlaneCount(imageBuffer)
    let grayColorSpace: CGColorSpace = CGColorSpaceCreateDeviceGray()
    let context: CGContext = CGContext(data: lumaBuffer, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: grayColorSpace, bitmapInfo: CGImageAlphaInfo.none.rawValue)!
    let dstImageFilter: CGImage = context.makeImage()!
    let imageRect: CGRect = CGRect(x: 0, y: 0, width: width, height: height)
    context.draw(dstImageFilter, in: imageRect)
    let image = UIImage(cgImage: dstImageFilter)
    DispatchQueue.main.sync(execute: { () -> Void in
        self.imageTest.image = image
    })
}
Answer 0 (score: 32)
The conversion is simple:
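A minimal sketch of one common approach: render the pixel buffer through Core Image and ask the CIContext for an 8-bit grayscale CGImage directly. (This assumes the capture session delivers a BGRA pixel buffer; the helper name is illustrative.)

```swift
import AVFoundation
import CoreImage
import UIKit

// Hypothetical helper: converts a CMSampleBuffer to a grayscale UIImage
// by letting Core Image do the color-space conversion.
func grayscaleImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    // Render into an 8-bit single-channel (L8) image in a gray color space.
    guard let graySpace = CGColorSpace(name: CGColorSpace.linearGray),
          let cgImage = context.createCGImage(ciImage,
                                              from: ciImage.extent,
                                              format: .L8,
                                              colorSpace: graySpace) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

Rendering through CIContext avoids reinterpreting raw BGRA bytes as grayscale, which is what produces the distorted output in the question.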
Answer 1 (score: 1)
It looks like the CMSampleBuffer is giving you RGBA data, so you can't build a grayscale image from it directly. You would need to build a new buffer where, for each pixel, you do something like gray = (pixel.red + pixel.green + pixel.blue) / 3, or create an ordinary RGBA image from the received data and then convert it to grayscale.

But in your code there is no conversion at all. You take a raw pointer to the buffer with CVPixelBufferGetBaseAddress, regardless of what kind of data is in there, and then pass that same pointer when creating an image that assumes the data it receives is grayscale.
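The per-pixel averaging described above could be sketched like this (assuming a non-planar BGRA pixel buffer; the function name is illustrative):

```swift
import CoreVideo

// Builds an 8-bit grayscale byte array from a BGRA pixel buffer by
// averaging the three color channels of every pixel.
func grayscaleBytes(from pixelBuffer: CVPixelBuffer) -> [UInt8]? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let src = base.assumingMemoryBound(to: UInt8.self)

    var gray = [UInt8](repeating: 0, count: width * height)
    for y in 0..<height {
        for x in 0..<width {
            let p = y * bytesPerRow + x * 4  // BGRA: 4 bytes per pixel
            let b = Int(src[p]), g = Int(src[p + 1]), r = Int(src[p + 2])
            gray[y * width + x] = UInt8((r + g + b) / 3)
        }
    }
    return gray
}
```

The resulting buffer could then be wrapped in a CGContext with a device-gray color space, similar to the question's code, since at that point the data really is one gray byte per pixel.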
Answer 2 (score: 1)
A more modern solution that handles orientation and improves performance by skipping the CGImage conversion:
func orientation() -> UIImage.Orientation {
    let curDeviceOrientation = UIDevice.current.orientation
    var exifOrientation: UIImage.Orientation
    switch curDeviceOrientation {
    case UIDeviceOrientation.portraitUpsideDown:  // Device oriented vertically, Home button on the top
        exifOrientation = .left
    case UIDeviceOrientation.landscapeLeft:       // Device oriented horizontally, Home button on the right
        exifOrientation = .upMirrored
    case UIDeviceOrientation.landscapeRight:      // Device oriented horizontally, Home button on the left
        exifOrientation = .down
    case UIDeviceOrientation.portrait:            // Device oriented vertically, Home button on the bottom
        exifOrientation = .up
    default:
        exifOrientation = .up
    }
    return exifOrientation
}

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    guard let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let ciimage: CIImage = CIImage(cvPixelBuffer: imageBuffer)
    let image = UIImage(ciImage: ciimage, scale: 1.0, orientation: orientation())
}