How to capture video with a custom camera

Date: 2017-06-27 01:10:10

Tags: ios swift camera avcapturesession

I have set up my custom camera and have already coded the video preview. I have a button on screen that should capture video when pressed, but I don't know how to implement that. Everything else is set up and working correctly so far.

Inside the start-recording button action I just need the code to capture the video and save it. Thanks.


2 Answers:

Answer 0: (score: 1)

It seems Apple prefers developers to capture video with the default camera. If you are happy with that, I found a tutorial online with code that can help: https://www.raywenderlich.com/94404/play-record-merge-videos-ios-swift

You can scroll down to the "Record Video" section, which walks you through the code.

Here is some of it:

import MobileCoreServices

You also need to adopt the same protocols as PlayVideoViewController, by adding the following to the end of the file:

extension RecordVideoViewController: UIImagePickerControllerDelegate {
}

extension RecordVideoViewController: UINavigationControllerDelegate {
}

Add the following code to RecordVideoViewController:

func startCameraFromViewController(_ viewController: UIViewController, withDelegate delegate: UIImagePickerControllerDelegate & UINavigationControllerDelegate) -> Bool {
  if UIImagePickerController.isSourceTypeAvailable(.camera) == false {
    return false
  }

  let cameraController = UIImagePickerController()
  cameraController.sourceType = .camera
  cameraController.mediaTypes = [kUTTypeMovie as String]
  cameraController.allowsEditing = false
  cameraController.delegate = delegate

  viewController.present(cameraController, animated: true, completion: nil)
  return true
}

This method follows the same logic as in PlayVideoViewController, but it accesses the .camera source to record video instead. Now add the following to record(_:):

startCameraFromViewController(self, withDelegate: self)

You're in familiar territory again. When the "Record Video" button is tapped, the code simply calls startCameraFromViewController(_:withDelegate:).

Build and run to see what you have so far. Go to the Record screen and tap the "Record Video" button. Instead of the photo gallery, the camera UI opens. Start recording a video by tapping the red record button at the bottom of the screen, and tap it again when you are done recording.

Cheers, Theo

Answer 1: (score: 0)

Here is working code. In a real project you need to handle optionals and errors properly, but you can use the following code as an example:

//
//  ViewController.swift
//  CustomCamera
//
//  Created by Taras Chernyshenko on 6/27/17.
//  Copyright © 2017 Taras Chernyshenko. All rights reserved.
//

import UIKit
import AVFoundation

class CameraViewController: UIViewController,
    AVCaptureAudioDataOutputSampleBufferDelegate,
    AVCaptureVideoDataOutputSampleBufferDelegate {

    @IBOutlet var recordOutlet: UIButton!
    @IBOutlet var recordLabel: UILabel!

    @IBOutlet var cameraView: UIView!
    var tempImage: UIImageView?

    private var session: AVCaptureSession = AVCaptureSession()
    private var deviceInput: AVCaptureDeviceInput?
    private var previewLayer: AVCaptureVideoPreviewLayer?
    private var videoOutput: AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()
    private var audioOutput: AVCaptureAudioDataOutput = AVCaptureAudioDataOutput()

    private var videoDevice: AVCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)!
    private var audioConnection: AVCaptureConnection?
    private var videoConnection: AVCaptureConnection?

    private var assetWriter: AVAssetWriter?
    private var audioInput: AVAssetWriterInput?
    private var videoInput: AVAssetWriterInput?

    private var fileManager: FileManager = FileManager()
    private var recordingURL: URL?

    private var isCameraRecording: Bool = false
    private var isRecordingSessionStarted: Bool = false

    private var recordingQueue = DispatchQueue(label: "recording.queue")
    var captureSession: AVCaptureSession?
    var stillImageOutput: AVCapturePhotoOutput?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?
    var currentCaptureDevice: AVCaptureDevice?

    var usingFrontCamera = false

    /* This is the function I want to use to start
     recording a video */

    @IBAction func recordingButton(_ sender: Any) {
        if self.isCameraRecording {
            self.stopRecording()
        } else {
            self.startRecording()
        }
        self.isCameraRecording = !self.isCameraRecording
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        self.setup()
    }

    private func setup() {
        self.session.sessionPreset = AVCaptureSessionPresetHigh

        self.recordingURL = URL(fileURLWithPath: "\(NSTemporaryDirectory() as String)/file.mov")
        if self.fileManager.isDeletableFile(atPath: self.recordingURL!.path) {
            _ = try? self.fileManager.removeItem(atPath: self.recordingURL!.path)
        }

        self.assetWriter = try? AVAssetWriter(outputURL: self.recordingURL!,
            fileType: AVFileTypeQuickTimeMovie)

        let audioSettings = [
            AVFormatIDKey : kAudioFormatAppleIMA4,
            AVNumberOfChannelsKey : 1,
            AVSampleRateKey : 16000.0
        ] as [String : Any]

        let videoSettings = [
            AVVideoCodecKey : AVVideoCodecH264,
            AVVideoWidthKey : UIScreen.main.bounds.size.width,
            AVVideoHeightKey : UIScreen.main.bounds.size.height
        ] as [String : Any]

        self.videoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo,
             outputSettings: videoSettings)
        self.audioInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio,
             outputSettings: audioSettings)

        self.videoInput?.expectsMediaDataInRealTime = true
        self.audioInput?.expectsMediaDataInRealTime = true

        if self.assetWriter!.canAdd(self.videoInput!) {
            self.assetWriter?.add(self.videoInput!)
        }

        if self.assetWriter!.canAdd(self.audioInput!) {
            self.assetWriter?.add(self.audioInput!)
        }

        self.deviceInput = try? AVCaptureDeviceInput(device: self.videoDevice)

        if self.session.canAddInput(self.deviceInput!) {
            self.session.addInput(self.deviceInput!)
        }

        self.previewLayer = AVCaptureVideoPreviewLayer(session: self.session)
        self.previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect

        let rootLayer = self.view.layer
        rootLayer.masksToBounds = true
        self.previewLayer?.frame = CGRect(x: 0, y: 0, width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height)

        rootLayer.insertSublayer(self.previewLayer!, at: 0)

        self.session.startRunning()

        DispatchQueue.main.async {
            self.session.beginConfiguration()

            if self.session.canAddOutput(self.videoOutput) {
                self.session.addOutput(self.videoOutput)
            }

            self.videoConnection = self.videoOutput.connection(withMediaType: AVMediaTypeVideo)
            if self.videoConnection?.isVideoStabilizationSupported == true {
                self.videoConnection?.preferredVideoStabilizationMode = .auto
            }
            self.session.commitConfiguration()

            let audioDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
            let audioIn = try? AVCaptureDeviceInput(device: audioDevice!)

            if self.session.canAddInput(audioIn!) {
                self.session.addInput(audioIn!)
            }

            if self.session.canAddOutput(self.audioOutput) {
                self.session.addOutput(self.audioOutput)
            }

            self.audioConnection = self.audioOutput.connection(withMediaType: AVMediaTypeAudio)
        }
    }

    private func startRecording() {
        if self.assetWriter?.startWriting() != true {
            print("error: \(String(describing: self.assetWriter?.error))")
        }

        self.videoOutput.setSampleBufferDelegate(self, queue: self.recordingQueue)
        self.audioOutput.setSampleBufferDelegate(self, queue: self.recordingQueue)
    }

    private func stopRecording() {
        self.videoOutput.setSampleBufferDelegate(nil, queue: nil)
        self.audioOutput.setSampleBufferDelegate(nil, queue: nil)

        self.assetWriter?.finishWriting {
            print("saved")
        }
    }
    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer
        sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

        if !self.isRecordingSessionStarted {
            let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            self.assetWriter?.startSession(atSourceTime: presentationTime)
            self.isRecordingSessionStarted = true
        }

        let description = CMSampleBufferGetFormatDescription(sampleBuffer)!

        if CMFormatDescriptionGetMediaType(description) == kCMMediaType_Audio {
            if self.audioInput!.isReadyForMoreMediaData {
                print("appendSampleBuffer audio")
                self.audioInput?.append(sampleBuffer)
            }
        } else {
            if self.videoInput!.isReadyForMoreMediaData {
                print("appendSampleBuffer video")
                if !self.videoInput!.append(sampleBuffer) {
                    print("Error writing video buffer")
                }
            }
        }
    }
}
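One detail in setup() worth isolating: AVAssetWriter fails to initialize if a file already exists at the output URL, which is why the code deletes any stale recording first. That part is plain Foundation, so here is a minimal sketch of it as a standalone helper (the function name makeRecordingURL and the "file.mov" default are just illustrative, not part of the answer's API):

```swift
import Foundation

// Build the output URL for a recording inside the temporary directory,
// removing any leftover file from a previous run, as setup() above does.
func makeRecordingURL(fileName: String = "file.mov") -> URL {
    let url = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent(fileName)
    let fileManager = FileManager.default
    // AVAssetWriter(outputURL:fileType:) throws if the file already
    // exists, so delete a stale recording before creating the writer.
    if fileManager.isDeletableFile(atPath: url.path) {
        try? fileManager.removeItem(atPath: url.path)
    }
    return url
}
```

You would then pass the returned URL to AVAssetWriter(outputURL:fileType:); for a permanent recording you would copy the finished file out of the temporary directory (or into the photo library) after finishWriting completes, since the system may purge tmp at any time.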