Using GoogleWebRTC together with ARKit on iOS (native)

Time: 2018-02-07 10:51:39

Tags: ios swift webrtc arkit

I am using Google's WebRTC framework together with ARKit.

I installed WebRTC with this Podfile:

platform :ios, '11.3'

target 'myApp' do
  use_frameworks!

  pod "GoogleWebRTC"
end

Audio works for me using this:

let audioDiscovery = AVCaptureDevice.DiscoverySession(deviceTypes: [AVCaptureDevice.DeviceType.builtInMicrophone], mediaType: .audio, position: AVCaptureDevice.Position.unspecified)
let audioDevice = audioDiscovery.devices[0]

let audioTrack = self.webRTCPeerConnectionFactory.audioTrack(withTrackId: audioDevice.uniqueID)
self.localMediaStream?.addAudioTrack(audioTrack)

self.webRTCPeerConnection?.add(self.localMediaStream!)
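
For context, the factory and the local stream referenced above are created roughly like this (just a sketch; the stream id "stream0" is a placeholder, not my real value):

let webRTCPeerConnectionFactory = RTCPeerConnectionFactory()
let localMediaStream = webRTCPeerConnectionFactory.mediaStream(withStreamId: "stream0")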

Then I tried the same for video (before adding the stream to the connection):

let videoDiscovery = AVCaptureDevice.DiscoverySession(deviceTypes: [AVCaptureDevice.DeviceType.builtInWideAngleCamera], mediaType: .video, position: AVCaptureDevice.Position.back)
let videoDevice = videoDiscovery.devices[0]

let videoSource = self.webRTCPeerConnectionFactory.videoSource()
let videoTrack = self.webRTCPeerConnectionFactory.videoTrack(with: videoSource, trackId: videoDevice.uniqueID)
self.localMediaStream?.addVideoTrack(videoTrack)

But I cannot see the video stream on the other side.
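
As far as I understand the GoogleWebRTC pod, a source created with videoSource() does not capture anything by itself; frames only start flowing once a capturer delivers them to the source. A rough sketch of that wiring with RTCCameraVideoCapturer (untested, and the track id "video0" is just a placeholder):

import AVFoundation
import WebRTC

// The RTCVideoSource produces no frames on its own; a capturer has to feed it.
let videoSource = self.webRTCPeerConnectionFactory.videoSource()
let videoCapturer = RTCCameraVideoCapturer(delegate: videoSource)
let videoTrack = self.webRTCPeerConnectionFactory.videoTrack(with: videoSource, trackId: "video0")
self.localMediaStream?.addVideoTrack(videoTrack)

// Pick the back wide-angle camera and start capturing into the source.
if let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
   let format = RTCCameraVideoCapturer.supportedFormats(for: device).last,
   let fpsRange = format.videoSupportedFrameRateRanges.first {
    videoCapturer.startCapture(with: device, format: format, fps: Int(fpsRange.maxFrameRate))
}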

I fixed this by using an AVCaptureDevice with an RTCVideoCapturer, but then ARKit no longer displays any video on the screen.
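
In other words, ARKit and the WebRTC camera capturer cannot both own the camera at the same time. The following is only a sketch (untested, all names are placeholders, and it assumes the pod build exposes RTCCVPixelBuffer and RTCVideoFrame) of letting ARKit keep the camera and pushing each ARFrame into the RTCVideoSource by hand:

import ARKit
import WebRTC

// Sketch only: ARKit keeps the camera, and every ARFrame is forwarded
// into the WebRTC video source instead of running a second capture session.
final class ARFrameForwarder: NSObject, ARSessionDelegate {
    private let videoSource: RTCVideoSource
    private let capturer: RTCVideoCapturer

    init(videoSource: RTCVideoSource) {
        self.videoSource = videoSource
        // A plain RTCVideoCapturer is enough here; it only identifies the frame origin.
        self.capturer = RTCVideoCapturer(delegate: videoSource)
        super.init()
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let rtcBuffer = RTCCVPixelBuffer(pixelBuffer: frame.capturedImage)
        let timeStampNs = Int64(frame.timestamp * 1_000_000_000)    // seconds -> nanoseconds
        let rtcFrame = RTCVideoFrame(buffer: rtcBuffer,
                                     rotation: ._0,                 // adjust for device orientation
                                     timeStampNs: timeStampNs)
        // RTCVideoSource conforms to RTCVideoCapturerDelegate,
        // so frames can be pushed into it directly.
        videoSource.capturer(capturer, didCapture: rtcFrame)
    }
}

Such a forwarder would then be set as the ARSession's delegate instead of starting an RTCCameraVideoCapturer.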

0 Answers:

No answers yet.