Taking a photo with a custom camera in Swift 3

Date: 2016-10-15 11:13:45

Tags: ios swift swift3 avcapturesession

In Swift 2.3 I used this code to take a picture with my custom camera:

    func didPressTakePhoto() {
        if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
            stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
                if sampleBuffer != nil {
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                    let dataProvider = CGDataProviderCreateWithCFData(imageData)
                    let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
                    let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)

                    self.captureImageView.image = image
                }
            })
        }
    }

But this line:

    stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in

shows this error:

    Value of type 'AVCapturePhotoOutput' has no member 'captureStillImageAsynchronouslyFromConnection'

I tried to fix this myself, but I kept running into more and more errors, which is why I am posting my original code.

Does anyone know how to get my code working again?

Thank you.

3 Answers:

Answer 0 (score: 7)

You can use AVCapturePhotoOutput in Swift 3. You need an AVCapturePhotoCaptureDelegate, which returns the CMSampleBuffer. You can also get a preview image if you set the previewFormat on the AVCapturePhotoSettings:
    import AVFoundation
    import UIKit

    class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

        let cameraOutput = AVCapturePhotoOutput()

        func capturePhoto() {
            let settings = AVCapturePhotoSettings()
            // Ask for a small (160x160) preview image alongside the full-size photo.
            let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                 kCVPixelBufferWidthKey as String: 160,
                                 kCVPixelBufferHeightKey as String: 160]
            settings.previewPhotoFormat = previewFormat
            self.cameraOutput.capturePhoto(with: settings, delegate: self)
        }

        // Called by AVFoundation once the photo has been processed (note: error is Error?, not NSError?).
        func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

            if let error = error {
                print(error.localizedDescription)
            }

            if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer,
                let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer),
                let image = UIImage(data: dataImage) {
                print(image.size)
            }
        }
    }
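
For context, the snippet above only shows the output side. Below is a minimal sketch of how an AVCapturePhotoOutput like cameraOutput might be attached to an AVCaptureSession before capturePhoto(with:delegate:) is called, assuming a typical Swift 3 setup; the CameraController name and its properties are illustrative and not part of the original answer.

    import AVFoundation
    import UIKit

    // Hypothetical session setup; `session`, `photoOutput`, and `CameraController` are illustrative names.
    final class CameraController: NSObject {

        let session = AVCaptureSession()
        let photoOutput = AVCapturePhotoOutput()

        func configureSession() throws {
            session.beginConfiguration()
            session.sessionPreset = AVCaptureSessionPresetPhoto

            // Attach the default video camera as input.
            guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo) else {
                session.commitConfiguration()
                return
            }
            let input = try AVCaptureDeviceInput(device: camera)
            if session.canAddInput(input) {
                session.addInput(input)
            }

            // Attach the photo output that capturePhoto(with:delegate:) will use.
            if session.canAddOutput(photoOutput) {
                session.addOutput(photoOutput)
            }

            session.commitConfiguration()
            session.startRunning()
        }
    }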

Answer 1 (score: 4)

Thanks to Sharpkits, I found my solution (this code works for me):

    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

        if let error = error {
            print(error.localizedDescription)
        }

        if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer,
            let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {

            // Get the JPEG data again without the embedded preview and decode it via Core Graphics.
            let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: nil) ?? dataImage
            let dataProvider = CGDataProvider(data: imageData as CFData)
            let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: CGColorRenderingIntent.absoluteColorimetric)

            // Photos from the camera arrive rotated, so fix the orientation here.
            let image = UIImage(cgImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.right)

            // cropToSquare(image:) and scaleImageWith(_:and:) are my own helper methods.
            let croppedImage = self.cropToSquare(image: image)
            let newImage = self.scaleImageWith(croppedImage, and: CGSize(width: 600, height: 600))

            print(UIScreen.main.bounds.width)

            self.tempImageView.image = newImage
            self.tempImageView.isHidden = false
        }
    }
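
The code above calls two helper methods, cropToSquare(image:) and scaleImageWith(_:and:), that the answer does not include. Here is one possible, hypothetical implementation consistent with how they are called; the answerer's actual helpers may differ.

    // Hypothetical implementations of the helpers used above; the answerer's originals are not shown.
    func cropToSquare(image: UIImage) -> UIImage {
        guard let cgImage = image.cgImage else { return image }
        let width = CGFloat(cgImage.width)
        let height = CGFloat(cgImage.height)
        let side = min(width, height)
        // Center a square crop rectangle inside the original pixel bounds.
        let cropRect = CGRect(x: (width - side) / 2.0,
                              y: (height - side) / 2.0,
                              width: side,
                              height: side)
        guard let croppedCGImage = cgImage.cropping(to: cropRect) else { return image }
        return UIImage(cgImage: croppedCGImage, scale: image.scale, orientation: image.imageOrientation)
    }

    func scaleImageWith(_ image: UIImage, and size: CGSize) -> UIImage {
        // Redraw the image into the target size (ignores aspect ratio, like the 600x600 call above).
        UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
        image.draw(in: CGRect(origin: .zero, size: size))
        let scaled = UIGraphicsGetImageFromCurrentImageContext() ?? image
        UIGraphicsEndImageContext()
        return scaled
    }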

Answer 2 (score: 2)

Great code. Thank you very much for the help and the example.

Just to clarify for those of us with slower mental capacity, like myself: the capture(_:didFinishProcessingPhotoSampleBuffer:...) delegate method is called behind the scenes when you call self.cameraOutput.capturePhoto(with: settings, delegate: self) inside your takePhoto method (or whatever you named it). You never call the capture method directly yourself; it happens automatically.
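
As a minimal sketch of that flow (the button action and takePhoto are placeholder names, not part of the answers above):

    // You only trigger capturePhoto(with:delegate:); AVFoundation later calls the
    // capture(_:didFinishProcessingPhotoSampleBuffer:...) delegate method for you.
    @IBAction func takePhotoTapped(_ sender: UIButton) {
        takePhoto()
    }

    func takePhoto() {
        let settings = AVCapturePhotoSettings()
        cameraOutput.capturePhoto(with: settings, delegate: self)
        // Do not call capture(...) here; it fires asynchronously once the photo is processed.
    }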