I am trying to use the webrtc library from Google's repo. Following the steps, I created a separate project containing instructions and code similar to AppRTC, and I was able to get it running: I could hold a conference between two devices. But when I tried to integrate it into an older project, WebRTC crashed. Below are the steps to reproduce the crash.
The crash happens when I try to create the VideoSource in the code snippet below. Any hints or suggestions are appreciated.
- (RTCVideoTrack *)createLocalVideoTrack {
    RTCVideoTrack *localVideoTrack = nil;
    if (_peerConnection && self.localMediaStream) {
        [_peerConnection removeStream:self.localMediaStream];
        self.localMediaStream = nil;
        self.localVideoTrack = nil;
        self.localAudioTrack = nil;
    }
    NSString *cameraID = nil;
    AVCaptureDevicePosition devicePosition;
    if (self.captureDevice == kWebrtcMediaCaptureDeviceFrontCam) {
        devicePosition = AVCaptureDevicePositionFront;
    } else {
        devicePosition = AVCaptureDevicePositionBack;
    }
    for (AVCaptureDevice *captureDevice in
         [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (captureDevice.position == devicePosition) {
            //[self configureCameraForHighestFrameRate:captureDevice];
            cameraID = [captureDevice localizedName];
            break;
        }
    }
    NSAssert(cameraID, @"Unable to get the requested camera id");
    RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
    RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
    // Crash happens on the next call:
    RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer
                                                        constraints:mediaConstraints];
    localVideoTrack = [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];
    return localVideoTrack;
}
And here is the crash log:
* thread #1: tid = 0x33125c, 0x320b9b2c libsystem_kernel.dylib`__psynch_cvwait + 24, queue = 'com.apple.main-thread'
frame #0: 0x320b9b2c libsystem_kernel.dylib`__psynch_cvwait + 24
frame #1: 0x32137388 libsystem_pthread.dylib`_pthread_cond_wait + 520
frame #2: 0x3213826c libsystem_pthread.dylib`pthread_cond_wait + 40
frame #3: 0x00515230 `rtc::Event::Wait(int) + 160
* frame #4: 0x003e4912 `webrtc::MethodCall2<webrtc::PeerConnectionFactoryInterface, rtc::scoped_refptr<webrtc::VideoSourceInterface>, cricket::VideoCapturer*, webrtc::MediaConstraintsInterface const*>::Marshal(rtc::Thread*) + 46
frame #5: 0x003e419c `webrtc::PeerConnectionFactoryProxy::CreateVideoSource(cricket::VideoCapturer*, webrtc::MediaConstraintsInterface const*) + 68
frame #6: 0x00414470 `-[RTCPeerConnectionFactory videoSourceWithCapturer:constraints:] + 192
frame #7: 0x0001fc4e `-[WebrtcManager createLocalVideoTrack](self=0x01896620, _cmd=0x0083e058) + 1662 at WebrtcManager.m:360
frame #8: 0x0001ca96 `__40-[WebrtcManager initializeWebrtcManager]_block_invoke(.block_descriptor=<unavailable>) + 46 at WebrtcManager.m:46
frame #9: 0x01420172 libdispatch.dylib`_dispatch_call_block_and_release + 10
frame #10: 0x0142015e libdispatch.dylib`_dispatch_client_callout + 22
frame #11: 0x01423e44 libdispatch.dylib`_dispatch_main_queue_callback_4CF + 1512
frame #12: 0x234ad608 CoreFoundation`__CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 8
frame #13: 0x234abd08 CoreFoundation`__CFRunLoopRun + 1512
frame #14: 0x233f8200 CoreFoundation`CFRunLoopRunSpecific + 476
frame #15: 0x233f8012 CoreFoundation`CFRunLoopRunInMode + 106
frame #16: 0x2ac91200 GraphicsServices`GSEventRunModal + 136
frame #17: 0x26b9ca58 UIKit`UIApplicationMain + 1440
frame #18: 0x00279f60 `main(argc=1, argv=0x00e37a78) + 132 at main.m:17
Answer (score: 2)
My bad! I was trying to create the PeerConnectionFactory and the local video track on a worker thread. The problem went away once I moved their creation to the main thread. I have uploaded my version of apprtc at Apprtc-Swift, built following the instructions in this tutorial.
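A minimal sketch of the fix described above, with the factory and track creation dispatched to the main queue. `initializeWebrtcManager` and `createLocalVideoTrack` are the names that appear in the question's stack trace; the property names and the rest of the setup are assumptions, not a definitive implementation:

    // Sketch: create the PeerConnectionFactory and the local video
    // track on the main thread instead of a worker thread, which is
    // what resolved the deadlock in rtc::Event::Wait seen above.
    - (void)initializeWebrtcManager {
        dispatch_async(dispatch_get_main_queue(), ^{
            // Both the factory and any tracks derived from it are
            // created on the main thread.
            self->_factory = [[RTCPeerConnectionFactory alloc] init];
            self.localVideoTrack = [self createLocalVideoTrack];
        });
    }

The crash itself is a deadlock, not a corruption: `CreateVideoSource` marshals the call to the factory's signaling thread and blocks the caller until it completes, so if the factory's threads and the calling thread are set up inconsistently, the wait in `rtc::Event::Wait` never returns.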