Taking a snapshot of a view that contains an AVCaptureVideoPreviewLayer

Date: 2017-03-05 13:28:44

Tags: ios uiview webrtc avcapturesession

I am using WebRTC to set up a video chat between two users. I would like to take a snapshot of the localView view, which shows one of the two people.

This is my class with the configureLocalPreview method, which connects the video stream to the UIViews:

@IBOutlet var remoteView: RTCEAGLVideoView!
@IBOutlet var localView: UIView!

var captureSession: AVCaptureSession?
var videoSource: RTCAVFoundationVideoSource?
var videoTrack: RTCVideoTrack?
var previewLayer: AVCaptureVideoPreviewLayer!

func configureLocalPreview() {
    // Pull the local video track and the AVFoundation capture session behind it.
    self.videoTrack = self.signaling.localMediaStream.videoTracks.first as? RTCVideoTrack
    self.videoSource = self.videoTrack?.source as? RTCAVFoundationVideoSource
    self.captureSession = self.videoSource?.captureSession

    // Attach a preview layer driven by that capture session to the local view.
    self.previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession!)
    self.previewLayer.frame = self.localView.bounds
    self.localView.layer.addSublayer(self.previewLayer)
    self.localView.isUserInteractionEnabled = true
    //self.localView.layer.position = CGPoint(x: 100, y: 100)
}

At the point where I want to access the snapshot, I call:

self.localView.pb_takeSnapshot()

pb_takeSnapshot comes from a UIView extension I found in another post. It is defined like this:

extension UIView {
    func pb_takeSnapshot() -> UIImage {
        UIGraphicsBeginImageContextWithOptions(bounds.size, false, UIScreen.main.scale)

        drawHierarchy(in: self.bounds, afterScreenUpdates: true)

        let image = UIGraphicsGetImageFromCurrentImageContext()!
        UIGraphicsEndImageContext()
        return image
    }
}

When I look at the image in the Xcode debugger, it is completely green, and the person I can see on the iPhone screen (inside that view) is missing:

[screenshot of the snapshot]

What could be the reason the person is not visible? Is it simply not possible to take a snapshot of a stream? Thanks for taking a look!

3 Answers:

Answer 0 (score: 3):

You should create the localView as an RTCEAGLVideoView instead of a UIView. I use the same for my localView and am able to take a snapshot with the very code snippet mentioned in your post.

Here is sample code that starts the camera and shows the local preview:

class ViewController: UIViewController, RTCEAGLVideoViewDelegate {

    var peerConnectionFactory: RTCPeerConnectionFactory!
    var videoSource: RTCAVFoundationVideoSource!
    var localTrack: RTCVideoTrack!
    var localView: RTCEAGLVideoView!

    override func viewDidLoad() {
        super.viewDidLoad()
        startCamera()
    }

    fileprivate func startCamera() {
        peerConnectionFactory = RTCPeerConnectionFactory()
        RTCInitializeSSL()
        RTCSetupInternalTracer()
        RTCSetMinDebugLogLevel(RTCLoggingSeverity.info)

        // Create an AVFoundation-backed video source and wrap it in a track.
        videoSource = peerConnectionFactory.avFoundationVideoSource(with: nil)
        localTrack = peerConnectionFactory.videoTrack(with: videoSource, trackId: "ARDAMSv0")

        // Render the local track into an RTCEAGLVideoView rather than a plain UIView.
        localView = RTCEAGLVideoView(frame: self.view.bounds)
        localView.delegate = self
        self.view.insertSubview(localView, at: 1)
        localTrack.add(localView)
    }

    func videoView(_ videoView: RTCEAGLVideoView, didChangeVideoSize size: CGSize) {
        print("Inside didChangeVideoSize")
    }
}
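With the preview rendered in an RTCEAGLVideoView, the snapshot helper from the question can be reused directly. A minimal sketch, assuming the pb_takeSnapshot() extension above is in scope and localView is kept as a property as in the sample (snapshotTapped is a hypothetical call site):

@IBAction func snapshotTapped(_ sender: UIButton) {
    // Grab a UIImage of the local preview on demand.
    let snapshot = localView.pb_takeSnapshot()
    print("Captured snapshot of size \(snapshot.size)")
}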

Answer 1 (score: 2):

Because AVCaptureVideoPreviewLayer is implemented as an OpenGL layer, you cannot capture it with a regular CoreGraphics context. I suggest accessing the raw frame data instead.

Add an AVCaptureVideoDataOutput with a delegate:

previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)

// Deliver raw sample buffers to the delegate on the main queue.
let captureVideoOutput = AVCaptureVideoDataOutput()
captureVideoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
captureSession?.addOutput(captureVideoOutput)

previewLayer.frame = localView.bounds
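The UIImage extension at the end of this answer assumes 32BGRA pixel data. As an added precaution (not in the original answer), you can request that format from the output explicitly:

// Ask for 32BGRA frames so the CGContext parameters used later line up.
captureVideoOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]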

Make your controller (or whatever object you use) conform to AVCaptureVideoDataOutputSampleBufferDelegate.
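For illustration, a minimal sketch of that conformance (the class name is hypothetical):

// Hypothetical controller that receives the raw sample buffers.
class CallViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Capture setup and the delegate callback below live here.
}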

Declare a shouldCaptureFrame variable and set it whenever you need to take a picture:

var shouldCaptureFrame: Bool = false
...
func takeSnapshot() {
  shouldCaptureFrame = true
}

Implement didOutputSampleBuffer from the delegate:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
  if !shouldCaptureFrame {
    return
  }

  let image = UIImage.from(sampleBuffer: sampleBuffer)
  shouldCaptureFrame = false
  // Hand `image` off here, e.g. save it or pass it to a completion handler.
}
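One way to hand the image back to the caller is a stored completion handler; a minimal sketch under that assumption (the snapshotCompletion property and this wrapper are not part of the original answer):

var snapshotCompletion: ((UIImage) -> Void)?

// Hypothetical convenience wrapper around the shouldCaptureFrame flag.
func takeSnapshot(completion: @escaping (UIImage) -> Void) {
    snapshotCompletion = completion
    shouldCaptureFrame = true
}

Inside captureOutput, calling snapshotCompletion?(image) then delivers the frame; since the delegate was scheduled on the main queue above, the handler can touch UI directly.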

Finally, the extension that provides the from(sampleBuffer:) function:

extension UIImage {

    /// Builds a UIImage from a raw sample buffer. Assumes 32BGRA pixel data.
    static func from(sampleBuffer: CMSampleBuffer) -> UIImage? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return nil
        }
        CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
        let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        // For 32BGRA, the context needs little-endian byte order plus an alpha
        // slot; byte order alone makes CGContext return nil.
        let context = CGContext(
            data: baseAddress,
            width: CVPixelBufferGetWidth(imageBuffer),
            height: CVPixelBufferGetHeight(imageBuffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(imageBuffer),
            space: colorSpace,
            bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue
        )
        let quartzImage = context?.makeImage()
        CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

        if let quartzImage = quartzImage {
            return UIImage(cgImage: quartzImage)
        }

        return nil
    }

}

Answer 2 (score: 1):

For the WebRTC video layer, you should use RTCEAGLVideoView as the view. For more details, take a look at the WebRTC sample application, AppRTC App.