I am trying to implement WebRTC in an iOS app using the GoogleWebRTC pod. I can make a video call between the iOS app and a web client, and in that case audio and video both work fine. However, when I make a video call between two iOS devices, there is no video (audio works). I have checked whether there is a remote stream, and there is. This is how I set up the local stream:
let localStream = connectionFactory?.mediaStream(withStreamId: "StreamID")
let audioTrack = connectionFactory?.audioTrack(withTrackId: "AudioTrackID")
let videoSource = connectionFactory?.avFoundationVideoSource(with: mediaConstraint)
let videoTrack = connectionFactory?.videoTrack(with: videoSource!, trackId: "VideoTrackID")
localStream?.addAudioTrack(audioTrack!)
localStream?.addVideoTrack(videoTrack!)
peerConnection?.add(localStream!)
Answer 0 (score: 2)
It could be any number of things; try my example: https://github.com/redfearnk/WebRTCVideoChat
Answer 1 (score: 1)
As far as I understand it, you can create the remote video track at the same time you first create the local video track; once video starts arriving over the peer connection, that track produces frames automatically. Here is sample code from a WebRTC iOS client, followed by a usage sketch for the delegate:
- (void)createMediaSenders {
    RTCMediaConstraints *constraints = [self defaultMediaAudioConstraints];
    RTCAudioSource *source = [_factory audioSourceWithConstraints:constraints];
    if (_isAudioEnabled) {
        RTCAudioTrack *track = [_factory audioTrackWithSource:source trackId:kDSAudioTrackId];
        [_peerConnection addTrack:track streamIds:@[ kDSMediaStreamId ]];
    }
    if (_isVideoEnabled) {
        _localVideoTrack = [self createLocalVideoTrack];
        if (_localVideoTrack) {
            [_peerConnection addTrack:_localVideoTrack streamIds:@[ kDSMediaStreamId ]];
            // Grab the remote video track from the transceiver's receiver;
            // it starts producing frames once media arrives from the peer.
            RTCVideoTrack *track = (RTCVideoTrack *)([self videoTransceiver].receiver.track);
            [_delegate appRTC:self didReceiveRemoteVideoTrack:track];
        }
    }
}
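On the receiving side, the delegate would attach that track to a renderer. A minimal Swift sketch, assuming the delegate callback above bridges roughly as below (the client parameter type is unknown, so AnyObject is a placeholder) and that remoteVideoView is an RTCMTLVideoView already in the view hierarchy:

import WebRTC

// Hypothetical Swift implementation of the delegate callback above:
// attach the remote track to a renderer on the main thread so it can
// draw frames as soon as they arrive.
func appRTC(_ client: AnyObject, didReceiveRemoteVideoTrack track: RTCVideoTrack) {
    DispatchQueue.main.async {
        track.add(self.remoteVideoView) // remoteVideoView: RTCMTLVideoView
    }
}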
Answer 2 (score: 0)
Found the problem. I was passing hard-coded strings as the IDs when creating the stream and the tracks, so once the connection was established the local and remote streams ended up with identical IDs. Supplying unique strings as the IDs solved the problem.
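For reference, a minimal sketch of that fix applied to the setup code from the question, deriving the IDs from UUIDs so the two peers can never pick the same ones (connectionFactory, mediaConstraint, and peerConnection are assumed to exist as in the question):

import WebRTC

// Derive stream/track IDs from UUIDs instead of hard-coded strings so the
// local and remote streams can never collide.
let localStream = connectionFactory?.mediaStream(withStreamId: "stream-\(UUID().uuidString)")
let audioTrack = connectionFactory?.audioTrack(withTrackId: "audio-\(UUID().uuidString)")
let videoSource = connectionFactory?.avFoundationVideoSource(with: mediaConstraint)
let videoTrack = connectionFactory?.videoTrack(with: videoSource!, trackId: "video-\(UUID().uuidString)")
localStream?.addAudioTrack(audioTrack!)
localStream?.addVideoTrack(videoTrack!)
peerConnection?.add(localStream!)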
Answer 3 (score: 0)
In (client: RTCClient, didReceiveRemoteVideoTrack remoteVideoTrack: RTCVideoTrack), perform a selector that fires 2 seconds later, and inside that selector add the track to the remote view.
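A sketch of that workaround in Swift, assuming remoteVideoView is an RTCMTLVideoView on the receiving view controller; DispatchQueue.main.asyncAfter plays the role of the delayed selector:

import WebRTC

func client(_ client: RTCClient, didReceiveRemoteVideoTrack remoteVideoTrack: RTCVideoTrack) {
    // Wait 2 seconds before attaching the track, as the answer suggests,
    // then render it on the main thread.
    DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
        remoteVideoTrack.add(self.remoteVideoView)
    }
}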