How to stream the camera feed from one iOS device to another using Multipeer Connectivity

Date: 2014-09-12 11:46:12

Tags: ios bluetooth core-bluetooth avcapturesession multipeer-connectivity

How can we efficiently stream the camera feed from one iOS device to another over Bluetooth or Wi-Fi in iOS 7? Below is the code for getting the sample buffers from the camera.

- (void)captureOutput:(AVCaptureOutput *)captureOutput
         didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
         fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
      bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

This gives us the image captured by the iOS camera.
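
Note that the bitmap-context code above assumes the capture output delivers 32BGRA pixel buffers. A minimal sketch of that configuration (the dispatch queue label is illustrative):

    // Ask the video data output for BGRA pixel buffers, which is the
    // layout the CGBitmapContextCreate call above expects.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("camera.frames", DISPATCH_QUEUE_SERIAL)];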

Can we send the sample buffer data directly to the other device with Multipeer Connectivity, or is there a more efficient way to stream the data to another iOS device?

Thanks.

2 answers:

Answer 0 (score: 1)

I found a way to do this: we can use Multipeer Connectivity to stream compressed images so the result looks like a camera stream.

The peer sending the stream uses this code in the captureOutput: delegate method (here cgBackedImage is the UIImage built from the sample buffer and timestamp is the frame's capture time, both prepared earlier in that method):

    NSData *imageData = UIImageJPEGRepresentation(cgBackedImage, 0.2);

    // maybe not always the correct input?  just using this to send current FPS...
    AVCaptureInputPort* inputPort = connection.inputPorts[0];
    AVCaptureDeviceInput* deviceInput = (AVCaptureDeviceInput*) inputPort.input;
    CMTime frameDuration = deviceInput.device.activeVideoMaxFrameDuration;
    NSDictionary* dict = @{
                           @"image": imageData,
                           @"timestamp" : timestamp,
                           @"framesPerSecond": @(frameDuration.timescale)
                           };
    NSData *data = [NSKeyedArchiver archivedDataWithRootObject:dict];

    [_session sendData:data toPeers:_session.connectedPeers withMode:MCSessionSendDataReliable error:nil];
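
Note that _session here is assumed to be an already-connected MCSession. A minimal setup sketch, where the "camera-stream" service type is a placeholder:

    // Create a session identified by the device name and advertise it
    // so nearby peers can discover and connect.
    MCPeerID *peerID = [[MCPeerID alloc] initWithDisplayName:[UIDevice currentDevice].name];
    _session = [[MCSession alloc] initWithPeer:peerID];
    _session.delegate = self;

    MCAdvertiserAssistant *assistant = [[MCAdvertiserAssistant alloc] initWithServiceType:@"camera-stream"
                                                                            discoveryInfo:nil
                                                                                  session:_session];
    [assistant start];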

On the receiving side:

- (void)session:(MCSession *)session didReceiveData:(NSData *)data fromPeer:(MCPeerID *)peerID {

//    NSLog(@"(%@) Read %d bytes", peerID.displayName, data.length);

    NSDictionary* dict = (NSDictionary*) [NSKeyedUnarchiver unarchiveObjectWithData:data];
    UIImage* image = [UIImage imageWithData:dict[@"image"] scale:2.0];
    NSNumber* framesPerSecond = dict[@"framesPerSecond"];

    // Display the image here, using framesPerSecond to pace updates
}

Since the FPS value travels with each frame, the receiver can use it to pace how the streamed images are displayed.
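
One way to use it, as a sketch only (the imageView property is an assumption, not part of the answer above):

    // Schedule the decoded frame at the interval the sender reported.
    NSTimeInterval frameInterval = 1.0 / [framesPerSecond doubleValue];
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(frameInterval * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        self.imageView.image = image;
    });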

Hope this helps.

Thanks.

Answer 1 (score: 1)

Here is, in my view, the best way to do it (I explain why at the end):

On the iOS device sending the image data:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(imageBuffer,0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);

    UIImage *image = [[UIImage alloc] initWithCGImage:newImage scale:1 orientation:UIImageOrientationUp];
    CGImageRelease(newImage);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    if (image) {
        NSData *data = UIImageJPEGRepresentation(image, 0.7);
        NSError *err;
        [((ViewController *)self.parentViewController).session sendData:data toPeers:((ViewController *)self.parentViewController).session.connectedPeers withMode:MCSessionSendDataReliable error:&err];
    }
}

On the iOS device receiving the image data:

typedef struct {
    size_t length;
    void *data;
} ImageCacheDataStruct;

- (void)session:(nonnull MCSession *)session didReceiveData:(nonnull NSData *)data fromPeer:(nonnull MCPeerID *)peerID
{
    dispatch_async(self.imageCacheDataQueue, ^{
        dispatch_semaphore_wait(self.semaphore, DISPATCH_TIME_FOREVER);
        const void *dataBuffer = [data bytes];
        size_t dataLength = [data length];

        // Allocate the struct itself (sizeof the type, not the pointer)
        ImageCacheDataStruct *imageCacheDataStruct = calloc(1, sizeof(ImageCacheDataStruct));
        imageCacheDataStruct->data = (void *)dataBuffer;
        imageCacheDataStruct->length = dataLength;

        __block const void *kMyKey;
        dispatch_queue_set_specific(self.imageDisplayQueue, &kMyKey, (void *)imageCacheDataStruct, NULL);

        dispatch_sync(self.imageDisplayQueue, ^{
            ImageCacheDataStruct *received = dispatch_queue_get_specific(self.imageDisplayQueue, &kMyKey);
            // Copy the bytes into an NSData before leaving this scope
            NSData *imageData = [NSData dataWithBytes:received->data length:received->length];
            free(received);

            UIImage *image = [UIImage imageWithData:imageData];
            if (image) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [((ViewerViewController *)self.childViewControllers.lastObject).view.layer setContents:(__bridge id)image.CGImage];
                    dispatch_semaphore_signal(self.semaphore);
                });
            } else {
                // Signal even when decoding fails so the pipeline doesn't stall
                dispatch_semaphore_signal(self.semaphore);
            }
        });
    });
}

The reason for the semaphore and the separate GCD queues is simple: you want the frames to be displayed at even intervals. Otherwise the video will sometimes slow down and then speed past normal to catch up. This scheme ensures the frames play back one after another at a steady pace, regardless of network bandwidth bottlenecks.
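
For completeness, the queues and semaphore used above are assumed to be properties created once up front; a minimal sketch of that setup (the queue labels are illustrative):

    // One-time setup, e.g. in the receiving view controller's viewDidLoad.
    // Serial queues keep frames ordered; the semaphore allows one in-flight frame.
    self.imageCacheDataQueue = dispatch_queue_create("com.example.imageCacheData", DISPATCH_QUEUE_SERIAL);
    self.imageDisplayQueue = dispatch_queue_create("com.example.imageDisplay", DISPATCH_QUEUE_SERIAL);
    self.semaphore = dispatch_semaphore_create(1);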