Recording video on a long-press gesture

Date: 2017-04-03 01:34:57

Tags: ios iphone video avfoundation

I currently have an app that lets me take a picture, display it to the user, and apply some filtering functionality.

I also have a preview layer that shows the user a live video feed of what the camera currently sees. If the user taps the preview, an image is captured; if they tap again, it goes back to the live feed.
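For reference, the tap toggle is along these lines (a rough sketch only — the actual handler isn't shown here; it just assumes the didTakePhoto flag, imageView, and imageOutput from the class definition in the update below):

func tapView(_ sender: UITapGestureRecognizer) {
    if didTakePhoto {
        // second tap: hide the still and fall back to the live preview layer
        imageView.isHidden = true
        didTakePhoto = false
    } else {
        // first tap: freeze the most recent camera frame as a still
        imageView.image = imageOutput
        imageView.isHidden = false
        didTakePhoto = true
    }
}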

This works fine, but now I want to do the same thing with video: instead of a tap, I want to record on a long press, similar to apps like Vine. Currently, I can record a video and save it to my device using this method. I'm confident the recording itself works, because I can see the file by downloading my app's container and looking through its temporary files.

I want to be able to record the video, present it to the user once my finger is released, and, if they tap again, show them the live feed once more so the whole flow can start over.

Here is the code for my long-press gesture:

func longPressView(_ sender: UILongPressGestureRecognizer) {
    if isRecording {
        takeVideo()
    }
    isRecording = false

    // check the recognizer that was passed in, rather than a stored optional
    if sender.state == .cancelled ||
        sender.state == .failed ||
        sender.state == .ended {
        videoFileOutput?.stopRecording()
    }
}
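A tidier sketch of the same idea — assuming takeVideo() ultimately calls startRecording(toOutputFileURL:recordingDelegate:), and with a hypothetical output path — switches on the recognizer's state so start and stop live in one place:

func longPressView(_ sender: UILongPressGestureRecognizer) {
    switch sender.state {
    case .began:
        // hypothetical output location in the temp directory
        let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("clip.mov")
        videoFileOutput?.startRecording(toOutputFileURL: outputURL, recordingDelegate: self)
    case .ended, .cancelled, .failed:
        // finger lifted (or gesture aborted): finish the clip
        videoFileOutput?.stopRecording()
    default:
        break
    }
}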

Here are my delegate functions:

func capture(_ captureOutput: AVCaptureFileOutput!, didStartRecordingToOutputFileAt fileURL: URL!, fromConnections connections: [Any]!) {
    print("You started recording!")
}

func capture(_ captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAt outputFileURL: URL!, fromConnections connections: [Any]!, error: Error!) {
    print("Finished recording: \(outputFileURL)")
    // recording has already stopped by the time this callback fires,
    // so there is no need to call stopRecording() again here
    getVideo(outputFileURL: outputFileURL as NSURL)
}
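Since didFinishRecordingToOutputFileAt hands back the file URL, one option for showing the clip is to lay an AVPlayerLayer over the preview (a minimal sketch, not code from my project — presentVideo(at:) is a hypothetical helper):

// plays the recorded file on a layer over cameraView;
// removing the layer later would return to the live feed
func presentVideo(at url: URL) {
    let player = AVPlayer(url: url)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = cameraView.bounds
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect
    cameraView.layer.addSublayer(playerLayer)
    player.play()
}

Calling presentVideo(at: outputFileURL) from didFinishRecordingToOutputFileAt, and tearing the layer down on the next tap, would mirror the still-image flow.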

For reference, when taking still images I use the following delegate:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    DispatchQueue.main.async {
        self.imageOutput = self.getImageFromSampleBuffer(buffer: sampleBuffer)
    }
}

I then set my image view's image to the imageOutput variable in order to show the user the still image.
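The getImageFromSampleBuffer(buffer:) helper isn't shown above; a common implementation (a sketch that may differ from the actual helper) converts the pixel buffer through Core Image:

func getImageFromSampleBuffer(buffer: CMSampleBuffer) -> UIImage? {
    // pull the raw pixel buffer out of the sample buffer
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) else { return nil }
    // wrap it in a CIImage and render it out to a CGImage
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}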

Any ideas on how to present the user with the video they just took?

Thanks in advance!

Update: 2017-04-03

Here is my class definition with its instance variables:

class Camera: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate, AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureFileOutputRecordingDelegate {

    @IBOutlet weak var cameraView: UIView!
    @IBOutlet weak var imageView: UIImageView!

    var captureSession: AVCaptureSession?
    var previewLayer: AVCaptureVideoPreviewLayer?
    var dataOutput: AVCaptureVideoDataOutput?
    var imageOutput: UIImage?
    var videoFileOutput: AVCaptureMovieFileOutput?

    var didTakePhoto = Bool()
    var isRecording: Bool = true

Also, my viewWillAppear method:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    captureSession = AVCaptureSession()
    captureSession?.sessionPreset = AVCaptureSessionPreset1920x1080

    let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

    do {
        let input = try AVCaptureDeviceInput(device: backCamera)

        if (captureSession?.canAddInput(input))! {
            captureSession?.addInput(input)

            // raw frame output used for the still-image path
            dataOutput = AVCaptureVideoDataOutput()
            dataOutput?.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value: kCVPixelFormatType_32BGRA)]
            dataOutput?.alwaysDiscardsLateVideoFrames = true

            if (captureSession?.canAddOutput(dataOutput))! {
                captureSession?.addOutput(dataOutput)

                // live preview of what the camera sees
                previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
                previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
                cameraView.layer.addSublayer(previewLayer!)
                captureSession?.startRunning()

                // deliver sample buffers on a background queue
                let queue = DispatchQueue(label: "com.RafaelNegron")
                dataOutput?.setSampleBufferDelegate(self, queue: queue)
            }
        }
    } catch {
        print(error.localizedDescription)
    }
}
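One detail worth flagging: viewWillAppear only adds dataOutput to the session, so videoFileOutput must be configured somewhere not shown. A minimal sketch of that missing wiring, assuming it belongs in the same do block:

// hypothetical setup for the movie file output, in the same
// style as the dataOutput wiring above
videoFileOutput = AVCaptureMovieFileOutput()
if (captureSession?.canAddOutput(videoFileOutput))! {
    captureSession?.addOutput(videoFileOutput)
}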

0 Answers