Swift 3 - Custom camera view - show a still image of the photo taken after a button tap

Asked: 2017-03-02 22:55:23

Tags: ios swift swift3 avfoundation

I'm using Swift 3 and Xcode 8.2.

I have a custom camera view that displays the video feed fine, and a button that I want to act as a shutter. When the user taps the button, I want a photo to be taken and displayed on the screen (e.g. Snapchat- or Facebook Messenger-style camera behavior).

Here is my code:

import UIKit
import AVFoundation

class CameraVC: UIViewController, AVCapturePhotoCaptureDelegate {

    // this is where the camera feed from the phone is going to be displayed
    @IBOutlet var cameraView : UIView!

    var shutterButton : UIButton = UIButton.init(type: .custom)

    // manages capture activity and coordinates the flow of data from input devices to capture outputs.
    var capture_session = AVCaptureSession()

    // a capture output for use in workflows related to still photography.
    var session_output = AVCapturePhotoOutput()

    // preview layer that we will have on our view so users can see the photo we took
    var preview_layer = AVCaptureVideoPreviewLayer()

    // still picture image is what we show as the picture taken, frozen on the screen
    var still_picture_image : UIImage!

    ... //more code in viewWillAppear that sets up the camera feed

    // called when the shutter button is pressed
    func shutterButtonPressed() {

        // get the actual video feed and take a photo from that feed
        session_output.capturePhoto(with: AVCapturePhotoSettings.init(format: [AVVideoCodecKey : AVVideoCodecJPEG]), delegate: self as AVCapturePhotoCaptureDelegate)
    }

    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

        // take the session output, get the buffer, and create an image from that buffer
        if let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer,
            let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {

            print("Here") // doesn't get here

        }

    }
}

When I run this code, "Here" never gets printed, and I can't find any Swift 3 tutorials on how to display this image. I'm guessing I want to take imageData, assign it to my still_picture_image, and somehow overlay that over the camera feed.

Any help or a point in the right direction would be greatly appreciated.

Edit

I added the following to my code:

        if let error = error {
            print(error.localizedDescription)
        }

But I still don't get any error printed.
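(Editor's note: the session setup elided with `...` in the question follows the usual AVFoundation pattern. A minimal Swift 3-era sketch, assuming the property names declared in the question's CameraVC and a standard back-camera configuration, might look like this:)

```swift
// Minimal Swift 3 session setup, assuming the question's properties:
// capture_session, session_output, preview_layer, cameraView.
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // grab the default back camera and wrap it in an input
    guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
          let input = try? AVCaptureDeviceInput(device: camera) else { return }

    if capture_session.canAddInput(input) {
        capture_session.addInput(input)
    }
    if capture_session.canAddOutput(session_output) {
        capture_session.addOutput(session_output)
    }

    // attach the preview layer to the view that shows the live feed
    preview_layer = AVCaptureVideoPreviewLayer(session: capture_session)
    preview_layer.frame = cameraView.bounds
    preview_layer.videoGravity = AVLayerVideoGravityResizeAspectFill
    cameraView.layer.addSublayer(preview_layer)

    capture_session.startRunning()
}
```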

2 Answers:

Answer 0 (score: 0)

Add the following code into your delegate method to print out the error being thrown:

if let error = error {
   print(error.localizedDescription)
}

Once you get your error resolved, I think this post should help you to extract the image: Taking photo with custom camera Swift 3

Answer 1 (score: 0)

OK, I figured out my problem:

First, drag a UIImageView onto the storyboard and have it take up the entire screen. This is where the still picture will be displayed after the shutter button is pressed.

Create that variable in code and link it up:

@IBOutlet weak var stillPicture : UIImageView!

Then, in viewDidLoad, make sure the UIImageView is inserted above your camera view:

self.view.insertSubview(stillPicture, aboveSubview: your_camera_view)

Call this function when the shutter button is tapped:

func shutterButtonPressed() {

    let settings = AVCapturePhotoSettings()

    // request a preview image; without this, previewPhotoSampleBuffer
    // arrives as nil in the delegate, so the optional binding in the
    // question's code silently fails and "Here" never prints
    let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
    let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                         kCVPixelBufferWidthKey as String: 160,
                         kCVPixelBufferHeightKey as String: 160]
    settings.previewPhotoFormat = previewFormat
    session_output.capturePhoto(with: settings, delegate: self)
}

Then, in your capture delegate:

func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let error = error {
            print(error.localizedDescription)
        }

        // take the session output, get the buffer, and create an image from that buffer
        if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            // this is the image that the user has taken!
            let takenImage : UIImage = UIImage(data: dataImage)!
            stillPicture?.image = takenImage
        } else {
            print("Error setting up photo capture")
        }
}
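(Editor's note: on iOS 11 and later, the didFinishProcessingPhotoSampleBuffer delegate method shown above was deprecated in favor of an AVCapturePhoto-based callback. The same logic, keeping the stillPicture outlet from this answer, would look roughly like this:)

```swift
// iOS 11+ replacement for the deprecated sample-buffer delegate method.
// `stillPicture` is the same UIImageView outlet used above.
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    if let error = error {
        print(error.localizedDescription)
        return
    }
    // fileDataRepresentation() returns the captured image as encoded data
    if let data = photo.fileDataRepresentation(), let takenImage = UIImage(data: data) {
        stillPicture?.image = takenImage
    } else {
        print("Error setting up photo capture")
    }
}
```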