Getting the red intensity of each frame of a video capture on iOS

Date: 2014-03-25 21:58:26

Tags: ios iphone objective-c avfoundation video-processing

OK, I am working on this project. The task is to measure heart rate using the iPhone/iPad camera. I am trying to capture video with AVFoundation, grab each frame, sum the red component of every pixel in the frame, and divide by the frame size to get a mean value.

First I set up the video capture:

-(void) setupAVCapture{
    _session = [[AVCaptureSession alloc] init];
    _session.sessionPreset = AVCaptureSessionPresetMedium;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!error) {
        if ([device lockForConfiguration:&error]) {
            if ([device hasTorch] && [device isTorchModeSupported:AVCaptureTorchModeOn]) {
               [device setTorchMode:AVCaptureTorchModeOn];
            }
            [device unlockForConfiguration];
        }
        if ( [_session canAddInput:input] )
            [_session addInput:input];
        AVCaptureVideoDataOutput *videoDataOutput = [AVCaptureVideoDataOutput new];


        [videoDataOutput setVideoSettings:@{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) }];
        [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
        dispatch_queue_t videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];

        if ( [_session canAddOutput:videoDataOutput] )
            [_session addOutput:videoDataOutput];


        [_session startRunning];
    }
    else{
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:[NSString stringWithFormat:@"Failed with error %d", (int)[error code]] message:[error localizedDescription] delegate:nil cancelButtonTitle:@"Dismiss" otherButtonTitles:nil];
        [alertView show];
    }
}

Then I use the delegate method, like this -

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
     // got an image
     CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
     //redScores property is an array which stores the red values of all the frames
     [self.redScores addObject: [NSNumber numberWithFloat:[self processPixelBuffer:pixelBuffer]]];
 }


-(float) processPixelBuffer:(CVPixelBufferRef) pixelBuffer{
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
    size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
    unsigned char *pixels = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
    int meanRedPixelWeight=0.0;
    for (int i = 0; i < (bufferWidth * bufferHeight); i++) {
        meanRedPixelWeight += pixels[2];
    }
    meanRedPixelWeight=meanRedPixelWeight/(bufferWidth*bufferHeight);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    NSLog(@"%d",meanRedPixelWeight);
    return meanRedPixelWeight;
}

But this does not seem to give me the correct red values. First, I see the values constantly decreasing, when they should be going up and down. Second, I recorded a video and processed it in MATLAB, like this -

v = VideoReader('filepath');
noOfFrames = v.NumberOfFrames;
x = zeros(1, noOfFrames);
for i = 1:noOfFrames,
   frame = read(v, i);
   redPlane = frame(:, :, 1);
   x(i) = sum(sum(redPlane)) / (size(frame, 1) * size(frame, 2));
end
My mean values are very different. MATLAB's are close to 255, so I can tell those are correct, since all the frames are almost entirely red.

Any ideas what is wrong with the Objective-C code?

1 Answer:

Answer 0 (score: 0)

I appreciate that you found another solution but, for reference, I think this would work:

- (float) processPixelBuffer:(CVPixelBufferRef) pixelBuffer{
  CVPixelBufferLockBaseAddress(pixelBuffer, 0);
  size_t pixelWidth = CVPixelBufferGetWidth(pixelBuffer);
  size_t pixelHeight = CVPixelBufferGetHeight(pixelBuffer);
  size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
  uint8_t *sourceBuffer = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
  // Copy the frame before unlocking, so we never read a buffer the
  // capture pipeline may be reusing.
  size_t bufferSize = bytesPerRow * pixelHeight;
  uint8_t *pixels = malloc(bufferSize);
  memcpy(pixels, sourceBuffer, bufferSize);
  CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
  long totalRedPixelWeight = 0;
  for (size_t row = 0; row < pixelHeight; row++) {
    // In kCVPixelFormatType_32BGRA each pixel is 4 bytes (B, G, R, A),
    // so the red byte of column `col` is at byte offset col * 4 + 2.
    for (size_t col = 0; col < pixelWidth; col++) {
      size_t redLoc = (bytesPerRow * row) + (col * 4) + 2;
      totalRedPixelWeight += pixels[redLoc];
    }
  }
  free(pixels);
  float meanRedPixelWeight = totalRedPixelWeight / (float)(pixelWidth * pixelHeight);
  NSLog(@"%f", meanRedPixelWeight);
  return meanRedPixelWeight;
}