I have a WebRTC app, but I want more control over how the video is displayed, so I'm busy implementing my own renderer. I got to the point where I start receiving frames through this callback:

renderer:(RTCVideoRenderer *)renderer didReceiveFrame:(RTCI420Frame *)frame

It passes an RTCI420Frame as a parameter. I need to somehow turn the image data inside frame into a texture, and I can't figure out how. I don't think GLKTextureLoader can do this (that's what I'm currently using to load textures from disk). How do I get the image data out of frame and create a texture from it?
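For reference, this is the rough direction I have in mind. It is only a sketch, assuming RTCI420Frame exposes yPlane/uPlane/vPlane and the corresponding pitches (as in the older libjingle Objective-C headers), that each pitch equals the plane width, and that _yTexture/_uTexture/_vTexture are texture ids I created earlier with glGenTextures:

#import <OpenGLES/ES2/gl.h>

// Upload one tightly packed 8-bit plane into a single-channel texture.
static void UploadPlane(GLuint texture, const uint8_t *plane,
                        size_t width, size_t height) {
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE,
                 (GLsizei)width, (GLsizei)height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, plane);
}

- (void)renderer:(RTCVideoRenderer *)renderer
 didReceiveFrame:(RTCI420Frame *)frame {
    // Caveat: the GL context must be current on this thread, and this assumes
    // yPitch == width (otherwise repack each row into a contiguous buffer first).
    size_t chromaWidth  = (frame.width + 1) / 2;
    size_t chromaHeight = (frame.height + 1) / 2;
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // plane rows are not 4-byte aligned
    UploadPlane(_yTexture, frame.yPlane, frame.width, frame.height);
    UploadPlane(_uTexture, frame.uPlane, chromaWidth, chromaHeight);
    UploadPlane(_vTexture, frame.vPlane, chromaWidth, chromaHeight);
    // A fragment shader then samples the three textures and converts YUV to RGB.
}

Is that roughly the right approach, or is there a more direct way?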
Answer (score: 1):
Here is my current class that handles everything related to WebRTC in my app.
These classes are a starting point for native WebRTC on iOS: just extend your view controller from BWRTCViewController and set the delegate. You can start testing right away without worrying about implementing the whole call sequence yourself; the only part you still have to take care of is the signaling.
// your call view controller .h
#import <BWRTCViewController.h>

@interface CallViewController : BWRTCViewController <BWRTCViewControllerDelegate>
@end

// your call view controller .m
- (void)viewDidLoad {
    [super viewDidLoad];
    if (/* this is the caller */) {
        [super callerSequence];
        // wait until the callee is ready to receive your offer, then call:
        [super startNegotiating];
    } else {
        /* the callee side doesn't have to do a thing */
    }
}

// received a remote sdp over your signaling channel
- (void)receivedSdp {
    [super receivedSessionDescription:/* your sdp description */
                             withType:/* your sdp type */];
}

// received a remote ice candidate over your signaling channel
- (void)receivedIce {
    [super receivedIceCandidate:/* your ice candidate */
                         sdpMid:/* your ice sdpMid */
                  sdpMLineIndex:/* your ice sdpMLineIndex */];
}

// delegate callback: got a local sdp
- (void)sendSessionDescription:(NSString *)sessionDescription_
                      withType:(NSString *)type_ {
    // use your signaling interface to send the sdp to the remote peer
}

// delegate callback: got a local ice candidate
- (void)sendICECandidate:(NSString *)candidate_
                  sdpMid:(NSString *)sdpMid_
           sdpMLineIndex:(NSInteger)sdpMLineIndex_ {
    // use your signaling interface to send the ice to the remote peer
}