Recording video with AVCaptureVideoDataOutput in Swift 3

Date: 2017-01-23 08:47:07

Tags: ios swift3 video-recording

After spending quite some time on this problem without getting anywhere, I decided to ask here.

We are using AVCaptureVideoDataOutput to get pixel data from the live camera feed via the captureOutput callback, but we also want to record video from that same data. In addition, we would like to know whether such a recording will be compressed like one made with AVCaptureMovieFileOutput.

I should mention that recording with AVCaptureMovieFileOutput works fine for us; however, AVCaptureMovieFileOutput and AVCaptureVideoDataOutput cannot be active on the same capture session at the same time.
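For context, a minimal session setup with AVCaptureVideoDataOutput might look like the sketch below. This is our assumption of the asker's setup, not code from the question; names such as `captureSession` and `videoDataOutputQueue` are ours, and `self` is assumed to adopt AVCaptureVideoDataOutputSampleBufferDelegate (Swift 3 era API).

```swift
import AVFoundation

// Sketch: wiring an AVCaptureVideoDataOutput into a session (Swift 3 API).
let captureSession = AVCaptureSession()
captureSession.sessionPreset = AVCaptureSessionPresetHigh

// Add the back camera as input.
if let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
   let input = try? AVCaptureDeviceInput(device: camera),
   captureSession.canAddInput(input) {
    captureSession.addInput(input)
}

// Deliver BGRA pixel buffers to the delegate on a dedicated queue.
let videoDataOutput = AVCaptureVideoDataOutput()
videoDataOutput.videoSettings =
    [kCVPixelBufferPixelFormatTypeKey as AnyHashable: Int(kCVPixelFormatType_32BGRA)]
videoDataOutput.alwaysDiscardsLateVideoFrames = true

let videoDataOutputQueue = DispatchQueue(label: "videoDataOutputQueue")
videoDataOutput.setSampleBufferDelegate(self, queue: videoDataOutputQueue)

if captureSession.canAddOutput(videoDataOutput) {
    captureSession.addOutput(videoDataOutput)
}
captureSession.startRunning()
```

Once the session is running, captureOutput fires for every frame on `videoDataOutputQueue`.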

You can find our captureOutput function below:

    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

        let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!

        CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

        let baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
        videoWidth      = CVPixelBufferGetWidth(imageBuffer)   // instance properties, reused later
        videoHeight     = CVPixelBufferGetHeight(imageBuffer)
        let colorSpace  = CGColorSpaceCreateDeviceRGB()

        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)

        let context = CGContext(data: baseAddress, width: videoWidth, height: videoHeight, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)

        let imageRef = context!.makeImage()

        CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

        let data = imageRef!.dataProvider!.data as! NSData
        let pixels = data.bytes.assumingMemoryBound(to: UInt8.self)

        /* What we do with the pixel data is irrelevant to the question, so the rest of the code is omitted to keep this simple. */
    }

1 Answer:

Answer 0 (score: 1)

After spending another part of my life on this, I figured out how to record video while grabbing pixel data for basic analysis of the live feed.

First I set up the AVAssetWriter and call this function before giving the actual record order.

    var sampleBufferGlobal: CMSampleBuffer?
    let writerFileName = "tempVideoAsset.mov"
    var presentationTime: CMTime!
    var outputSettings = [String: Any]()
    var videoWriterInput: AVAssetWriterInput!
    var assetWriter: AVAssetWriter!

    func setupAssetWriter() {

        eraseFile(fileToErase: writerFileName)

        presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBufferGlobal!)

        outputSettings = [AVVideoCodecKey  : AVVideoCodecH264,
                          AVVideoWidthKey  : NSNumber(value: Float(videoWidth)),
                          AVVideoHeightKey : NSNumber(value: Float(videoHeight))] as [String: Any]

        videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)

        assetWriter = try? AVAssetWriter(outputURL: createFileURL(writerFileName), fileType: AVFileTypeQuickTimeMovie)

        assetWriter.add(videoWriterInput)
    }
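The answer does not show the `createFileURL` and `eraseFile` helpers it calls; a plausible implementation (our assumption, not the author's code) could be:

```swift
import Foundation

// Hypothetical helpers assumed by setupAssetWriter(); not shown in the answer.
func createFileURL(_ fileName: String) -> URL {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    return documents.appendingPathComponent(fileName)
}

func eraseFile(fileToErase fileName: String) {
    // AVAssetWriter fails if the output file already exists,
    // so remove any leftover from a previous run.
    try? FileManager.default.removeItem(at: createFileURL(fileName))
}
```

As for the compression part of the question: because the output settings above specify AVVideoCodecH264, the asset writer encodes the frames to H.264, so the resulting file is compressed much like one produced by AVCaptureMovieFileOutput; the bitrate can additionally be tuned through AVVideoCompressionPropertiesKey (e.g. AVVideoAverageBitRateKey) in the same settings dictionary.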

I wrote another function to do the recording and call it from captureOutput, after first copying the sample buffer into sampleBufferGlobal (sampleBufferGlobal = sampleBuffer) in that same function.

    func writeVideoFromData() {

        if assetWriter?.status == AVAssetWriterStatus.unknown {

            // First frame: start the writer session at this frame's timestamp.
            assetWriter?.startWriting()
            assetWriter?.startSession(atSourceTime: presentationTime)
        }

        if assetWriter?.status == AVAssetWriterStatus.writing {

            if videoWriterInput.isReadyForMoreMediaData {

                if videoWriterInput.append(sampleBufferGlobal!) == false {

                    print("we have a problem writing video")
                }
            }
        }
    }
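Putting it together, the capture callback drives the writer roughly like this (a sketch; the lazy-setup check and ordering are our assumptions about how the answer's pieces connect):

```swift
// Inside captureOutput(_:didOutputSampleBuffer:from:), on the capture queue:
sampleBufferGlobal = sampleBuffer

if assetWriter == nil {
    setupAssetWriter()   // create the writer lazily on the first frame
}
writeVideoFromData()     // appends sampleBufferGlobal while the writer is active
```

Note that both setup and appending happen on the video data output's queue, which keeps the writer calls serialized.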

Then, to stop the recording, I used the following function.

    func stopAssetWriter() {

        videoWriterInput.markAsFinished()

        assetWriter?.finishWriting(completionHandler: {

            if self.assetWriter?.status == AVAssetWriterStatus.failed {

                print("creating movie file failed")

            } else {

                print("creating movie file was a success")

                DispatchQueue.main.async(execute: { () -> Void in
                    // back on the main queue: update the UI, play back the file, etc.
                })
            }
        })
    }