Mirror front-facing camera video in iOS

Asked: 2018-12-16 10:03:33

Tags: ios swift avfoundation cgaffinetransform

I am using AVFoundation to add video recording to an app. I managed to record a video and then display it, but I then noticed that, unlike the preview, the front-camera footage is not mirrored along the vertical axis. This appears to be the standard behavior, but I want the recorded video to look like the preview. I believe a CGAffineTransform can do this, but I am not sure how to apply it to the video.

Here is what I have so far:

extension CameraViewController: AVCaptureFileOutputRecordingDelegate {
    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        guard error == nil else {
            print("Error recording movie: \(error!.localizedDescription)")
            return
        }

        if self.currentCameraPosition == .front {
            mirrorVideo(outputFileURL)
        }
        performSegue(withIdentifier: "ShowVideo", sender: outputFileURL)
    }

    func mirrorVideo(_ outputFileURL: URL){
        var transform: CGAffineTransform = CGAffineTransform(scaleX: -1.0, y: 1.0)
        transform = transform.rotated(by: CGFloat(Double.pi/2))
        // Apply transform
    }
}

2 Answers:

Answer 0 (score: 1)

Use this method after adding the inputs and outputs to your AVCaptureSession:
    private func adjustVideoMirror() {
        guard let currentCameraInput: AVCaptureDeviceInput = captureSession.inputs.first as? AVCaptureDeviceInput else {
            return
        }

        // Mirror the recorded output only when the front camera is the current input.
        if let conn = movieOutput.connection(with: .video) {
            conn.isVideoMirrored = currentCameraInput.device.position == .front
        }
    }

The key is the isVideoMirrored property.
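
For context, here is a minimal sketch of where that call fits during session setup (my own illustration; the session, output, and camera names are placeholders, not from the answer above). The video connection only exists once the output has been added, and mirroring can only be set when the connection supports it and automatic adjustment is turned off:

import AVFoundation

let captureSession = AVCaptureSession()
let movieOutput = AVCaptureMovieFileOutput()

func configureSession(with camera: AVCaptureDevice) throws {
    captureSession.beginConfiguration()

    // 1. Add the camera input and the movie output first;
    //    the video connection is created when the output is added.
    let input = try AVCaptureDeviceInput(device: camera)
    if captureSession.canAddInput(input) { captureSession.addInput(input) }
    if captureSession.canAddOutput(movieOutput) { captureSession.addOutput(movieOutput) }

    // 2. Then mirror the connection so the recorded file matches the preview.
    if let conn = movieOutput.connection(with: .video), conn.isVideoMirroringSupported {
        conn.automaticallyAdjustsVideoMirroring = false
        conn.isVideoMirrored = (input.device.position == .front)
    }

    captureSession.commitConfiguration()
}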

Answer 1 (score: 0)

Based on the reply I received and some experimenting, I arrived at the following answer:

extension CameraViewController: AVCaptureFileOutputRecordingDelegate {
    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        if error != nil {
            print("Error recording movie: \(error!.localizedDescription)")
        } else {
            processMovie()
        }
    }

    func processMovie() {
        // Copy the recorded video track into a mutable composition so a
        // preferredTransform can be applied before export.
        let asset = AVAsset(url: CameraViewController.movieURL)
        let composition = AVMutableComposition()
        let assetVideoTrack = asset.tracks(withMediaType: .video).last!
        let compositionVideoTrack = composition.addMutableTrack(withMediaType: AVMediaType.video,
                                                                preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))
        try? compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: asset.duration),
                                                    of: assetVideoTrack,
                                                    at: CMTime.zero)
        if self.currentCameraPosition == .rear {
            // Keep the original orientation for rear-camera footage.
            compositionVideoTrack?.preferredTransform = assetVideoTrack.preferredTransform
        }
        if self.currentCameraPosition == .front {
            // Flip horizontally and rotate to portrait so the exported file
            // matches the mirrored preview.
            compositionVideoTrack?.preferredTransform = CGAffineTransform(scaleX: -1.0, y: 1.0).rotated(by: CGFloat(Double.pi/2))
        }

        if let exporter = AVAssetExportSession(asset: composition,
                                               presetName: AVAssetExportPresetHighestQuality) {
            exporter.outputURL = CameraViewController.exportMovieURL
            exporter.outputFileType = AVFileType.mov
            exporter.shouldOptimizeForNetworkUse = true
            exporter.exportAsynchronously() {
                DispatchQueue.main.async {
                    self.performSegue(withIdentifier: "ShowVideo", sender: nil)
                }
            }
        }
    }
}
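
One detail worth adding on top of that answer (my own sketch, not part of the original): exportAsynchronously also calls its completion handler when the export fails, so checking the session's status before performing the segue avoids showing a file that was never written. The helper below mirrors the answer's export settings; the function and parameter names are illustrative.

import AVFoundation

// Sketch only: wraps the export step with a status check before reporting success.
func exportComposition(_ composition: AVMutableComposition, to url: URL,
                       completion: @escaping (Bool) -> Void) {
    guard let exporter = AVAssetExportSession(asset: composition,
                                              presetName: AVAssetExportPresetHighestQuality) else {
        completion(false)
        return
    }
    exporter.outputURL = url
    exporter.outputFileType = .mov
    exporter.shouldOptimizeForNetworkUse = true
    exporter.exportAsynchronously {
        DispatchQueue.main.async {
            if exporter.status == .completed {
                completion(true)   // safe to segue to the playback screen
            } else {
                // .failed or .cancelled; exporter.error carries the reason.
                print("Export failed: \(exporter.error?.localizedDescription ?? "unknown error")")
                completion(false)
            }
        }
    }
}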