Saving a QuickBlox video chat

Posted: 2014-07-08 14:56:19

Tags: ios objective-c video chat quickblox

I'm using the QuickBlox iOS SDK for video chat in my app, and it works fine. Now I want to record the chat video and save it to the camera roll. How can I do that? I have gone through their documentation and implemented this:

-(IBAction)record:(id)sender {
    // Create video chat
    videoChat = [[QBChat instance] createAndRegisterVideoChatInstance];
    [videoChat setIsUseCustomVideoChatCaptureSession:YES];

    // Create capture session
    captureSession = [[AVCaptureSession alloc] init];

    // ... setup capture session here

    /*We create a serial queue to handle the processing of our frames*/
    dispatch_queue_t callbackQueue = dispatch_queue_create("cameraQueue", NULL);
    [videoCaptureOutput setSampleBufferDelegate:self queue:callbackQueue];

    /*We start the capture*/
    [captureSession startRunning];
}

 -(void)captureOutput:(AVCaptureOutput *)captureOutput  didOutputSampleBuffer: (CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

  // Do something with samples
  // ...

  // forward video samples to SDK
  [videoChat processVideoChatCaptureVideoSample:sampleBuffer];
 }

But I don't know what to do from here. How do I get the video data?

1 Answer:

Answer 0 (score: 0)

From the QuickBlox docs:

To set up a custom video capture session, just follow these steps:

- Create an instance of AVCaptureSession
- Set up the input and output
- Implement the frame callback and forward all frames to the QuickBlox iOS SDK
- Tell the QuickBlox SDK that you will use your own capture session

To set up the custom video capture session, configure its input and output:

-(void) setupVideoCapture{
self.captureSession = [[AVCaptureSession alloc] init];

__block NSError *error = nil;

// set preset
[self.captureSession setSessionPreset:AVCaptureSessionPresetLow];


// Setup the Video input
AVCaptureDevice *videoDevice = [self frontFacingCamera];
//
AVCaptureDeviceInput *captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if(error){
    QBDLogEx(@"deviceInputWithDevice Video error: %@", error);
}else{
    if ([self.captureSession  canAddInput:captureVideoInput]){
        [self.captureSession addInput:captureVideoInput];
    }else{
        QBDLogEx(@"cantAddInput Video");
    }
}

// Setup Video output
AVCaptureVideoDataOutput *videoCaptureOutput = [[AVCaptureVideoDataOutput alloc] init];
videoCaptureOutput.alwaysDiscardsLateVideoFrames = YES;
//
// Set the video output to store frame in BGRA (It is supposed to be faster)
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[videoCaptureOutput setVideoSettings:videoSettings];
/*Add the output to the capture session*/
if([self.captureSession canAddOutput:videoCaptureOutput]){
    [self.captureSession addOutput:videoCaptureOutput];
}else{
    QBDLogEx(@"cantAddOutput");
}
[videoCaptureOutput release];


// set FPS
int framesPerSecond = 3;
AVCaptureConnection *conn = [videoCaptureOutput connectionWithMediaType:AVMediaTypeVideo];
if (conn.isVideoMinFrameDurationSupported){
    conn.videoMinFrameDuration = CMTimeMake(1, framesPerSecond);
}
if (conn.isVideoMaxFrameDurationSupported){
    conn.videoMaxFrameDuration = CMTimeMake(1, framesPerSecond);
}

/*We create a serial queue to handle the processing of our frames*/
dispatch_queue_t callbackQueue= dispatch_queue_create("cameraQueue", NULL);
[videoCaptureOutput setSampleBufferDelegate:self queue:callbackQueue];
dispatch_release(callbackQueue);

// Add preview layer
AVCaptureVideoPreviewLayer *prewLayer = [[[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession] autorelease];
[prewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CGRect layerRect = [[myVideoView layer] bounds];
[prewLayer setBounds:layerRect];
[prewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect),CGRectGetMidY(layerRect))];
myVideoView.hidden = NO;
[myVideoView.layer addSublayer:prewLayer];


/*We start the capture*/
[self.captureSession startRunning];
}

- (AVCaptureDevice *) cameraWithPosition:(AVCaptureDevicePosition) position{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}


- (AVCaptureDevice *) backFacingCamera{
    return [self cameraWithPosition:AVCaptureDevicePositionBack];
}

- (AVCaptureDevice *) frontFacingCamera{
    return [self cameraWithPosition:AVCaptureDevicePositionFront];
}

Implement the frame callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput  didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    // Usually we just forward camera frames to QuickBlox SDK
    // But we also can do something with them before, for example - apply some video filters or so  
    [self.videoChat processVideoChatCaptureVideoSample:sampleBuffer];
}
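Note that this callback only forwards frames to the SDK; it never records them. One way to capture the data (a sketch, not from the QuickBlox docs) is to also append each sample buffer to an AVAssetWriter inside the same callback. The `assetWriter` and `writerInput` properties used here are hypothetical and would be configured before the capture session starts:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {

    // Forward the frame to the QuickBlox SDK as before
    [self.videoChat processVideoChatCaptureVideoSample:sampleBuffer];

    // Start the writer session on the first frame
    if (self.assetWriter.status == AVAssetWriterStatusUnknown) {
        [self.assetWriter startWriting];
        [self.assetWriter startSessionAtSourceTime:
            CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }

    // Append the frame to the movie file if the input is ready
    if (self.assetWriter.status == AVAssetWriterStatusWriting &&
        self.writerInput.isReadyForMoreMediaData) {
        [self.writerInput appendSampleBuffer:sampleBuffer];
    }
}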

Tell the QuickBlox iOS SDK that we use our own video capture session:

self.videoChat = [[QBChat instance] createAndRegisterVideoChatInstance];
self.videoChat.viewToRenderOpponentVideoStream = opponentVideoView;
//
// we use own video capture session
self.videoChat.isUseCustomVideoChatCaptureSession = YES;
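
None of the above actually saves anything to the camera roll, which was the original question. A possible completion (a sketch, assuming you append frames to an AVAssetWriter as they arrive; the `assetWriter` and `writerInput` properties are hypothetical, and the snippet is written ARC-style unlike the MRC code above) is to set up the writer before starting the capture session, then finish the file and hand it to ALAssetsLibrary when the chat ends:

- (void)setupAssetWriter {
    // Write the recording to a temporary .mov file
    NSURL *outputURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"chat.mov"]];
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];

    NSError *error = nil;
    self.assetWriter = [AVAssetWriter assetWriterWithURL:outputURL
                                                fileType:AVFileTypeQuickTimeMovie
                                                   error:&error];

    // H.264 output; pick dimensions matching your session preset
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @480,
                                AVVideoHeightKey : @640 };
    self.writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:settings];
    self.writerInput.expectsMediaDataInRealTime = YES;

    if ([self.assetWriter canAddInput:self.writerInput]) {
        [self.assetWriter addInput:self.writerInput];
    }
}

- (void)stopRecordingAndSave {
    [self.writerInput markAsFinished];
    [self.assetWriter finishWritingWithCompletionHandler:^{
        // Save the finished movie to the camera roll
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeVideoAtPathToSavedPhotosAlbum:self.assetWriter.outputURL
                                    completionBlock:^(NSURL *assetURL, NSError *err) {
            NSLog(@"Saved to camera roll: %@ (error: %@)", assetURL, err);
        }];
    }];
}

Call -setupAssetWriter before [self.captureSession startRunning], and -stopRecordingAndSave when the chat ends. (On current iOS versions you would use the Photos framework instead of ALAssetsLibrary, which was the standard API at the time this was written.)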