How to get camera frames frame by frame in Xcode

Asked: 2013-03-14 12:23:25

Tags: iphone ios xcode camera

Hi, I want to capture frames one by one from the iPhone's rear camera. Here is what I have done so far.

  1. I open the camera in full-screen mode:

    - (IBAction)showCameraUI {
        [self startCameraControllerFromViewController:self
                                        usingDelegate:self];
    }

    - (BOOL)startCameraControllerFromViewController:(UIViewController *)controller
                                      usingDelegate:(id)delegate {
    if (([UIImagePickerController isSourceTypeAvailable:
          UIImagePickerControllerSourceTypeCamera] == NO)
        || (delegate == nil)
        || (controller == nil))
        return NO;
    
    
    UIImagePickerController *cameraUI = [[UIImagePickerController alloc] init];
    cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
    
    // Displays a control that allows the user to choose picture or
    // movie capture, if both are available:
    cameraUI.mediaTypes =
    [UIImagePickerController availableMediaTypesForSourceType:
     UIImagePickerControllerSourceTypeCamera];
    
    // Hides the controls for moving & scaling pictures, or for
    // trimming movies. To instead show the controls, use YES.
    cameraUI.allowsEditing = NO;
    
    cameraUI.delegate = delegate;
    cameraUI.showsCameraControls=NO;
    cameraUI.navigationBarHidden=YES;
    cameraUI.toolbarHidden=YES;
    cameraUI.wantsFullScreenLayout=YES;
    cameraUI.cameraViewTransform = CGAffineTransformScale(cameraUI.cameraViewTransform, CAMERA_SCALAR_SX, CAMERA_SCALAR_SY);
    UIButton *btnRecording = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    CGRect buttonRect =CGRectMake(190 , 420, 100, 39); // position in the parent view and set the size of the button
    btnRecording.frame = buttonRect;
    [btnRecording setTitle:@"Recording" forState:UIControlStateNormal];
    // add targets and actions
    [btnRecording addTarget:self action:@selector(buttonClicked:) forControlEvents:UIControlEventTouchUpInside];
     cameraUI.cameraOverlayView= btnRecording;
    [controller presentModalViewController: cameraUI animated: YES];
    return YES;
    

    }
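
For context, here is how the `buttonClicked:` action referenced above might look. This is only a sketch under an assumption: the selector name comes from the `addTarget:` call in the snippet, but the body (starting the capture session from step 2) is a guess at the intended wiring, not code from the original post.

```objc
// Hypothetical action for the "Recording" overlay button added above.
// Assumes setupCaptureSession (shown in the next step) starts frame delivery.
- (void)buttonClicked:(UIButton *)sender {
    NSLog(@"Recording button tapped");
    [self setupCaptureSession];
}
```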

  2. Set up AVCapture to get images frame by frame:

      - (void)setupCaptureSession {
      NSError *error = nil;

      // Create the session
      AVCaptureSession *session = [[AVCaptureSession alloc] init];
      

      // session.AVCaptureTorchModeOn = YES;

      // Configure the session to produce lower resolution video frames, if your
      // processing algorithm can cope. We'll specify medium quality for the
      // chosen device.
      session.sessionPreset = AVCaptureSessionPresetLow;
      
      // Find a suitable AVCaptureDevice
      AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
      [device lockForConfiguration:nil];
      [device setTorchMode:AVCaptureTorchModeOn];  // use AVCaptureTorchModeOff to turn off
      [device unlockForConfiguration];
      // Create a device input with the device and add it to the session.
      AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                          error:&error];
      if (!input) {
          // Handling the error appropriately.
      }
      [session addInput:input];
      
      // Create a VideoDataOutput and add it to the session
      AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init] ;
      

      // output.alwaysDiscardsLateVideoFrames = YES;
      [session addOutput:output];

      dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
      [output setSampleBufferDelegate:self queue:queue];
      dispatch_release(queue);
      
      // Specify the pixel format
      output.videoSettings =
      [NSDictionary dictionaryWithObject:
       [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                  forKey:(id)kCVPixelBufferPixelFormatTypeKey];
      
      
      // If you wish to cap the frame rate to a known value, such as 15 fps, set
      // minFrameDuration.
      

      // output.minFrameDuration = CMTimeMake(1,1);

      // Start the session running to start the flow of data
        NSLog(@"session is going to start at here");
      [session startRunning];
      
      // Assign session to an ivar.
      //[self setSession:session];
      

      }
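
One detail worth noting from the commented-out "Assign session to an ivar" line above: `session` is a local variable, so nothing keeps it alive after `setupCaptureSession` returns. A minimal sketch of retaining it in a property (the property name `session` is an assumption; under manual retain/release, use `retain` instead of `strong`):

```objc
// In the class extension or interface:
@property (nonatomic, strong) AVCaptureSession *session;

// At the end of setupCaptureSession, instead of the commented line:
self.session = session;  // keeps the session and its outputs alive
```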

      // Create a UIImage from sample buffer data
      - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
      NSLog(@"got a frame");
      CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
      // Lock the base address of the pixel buffer
      CVPixelBufferLockBaseAddress(imageBuffer, 0);

      // Get the number of bytes per row for the pixel buffer
      size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
      // Get the pixel buffer width and height
      size_t width = CVPixelBufferGetWidth(imageBuffer);
      size_t height = CVPixelBufferGetHeight(imageBuffer);
      
      // Create a device-dependent RGB color space
      CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
      if (!colorSpace)
      {
          NSLog(@"CGColorSpaceCreateDeviceRGB failure");
          return nil;
      }
      
      // Get the base address of the pixel buffer
      void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
      // Get the data size for contiguous planes of the pixel buffer.
      size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);
      
      // Create a Quartz direct-access data provider that uses data we supply
      CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress, bufferSize,
                                                                NULL);
      // Create a bitmap image from data supplied by our data provider
      CGImageRef cgImage =
      CGImageCreate(width,
                    height,
                    8,
                    32,
                    bytesPerRow,
                    colorSpace,
                    kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                    provider,
                    NULL,
                    true,
                    kCGRenderingIntentDefault);
      CGDataProviderRelease(provider);
      CGColorSpaceRelease(colorSpace);
      
      // Create and return an image object representing the specified Quartz image
      UIImage *image = [UIImage imageWithCGImage:cgImage];
      CGImageRelease(cgImage);
      
      CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
      
      return image;
      

      }
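
Since `imageFromSampleBuffer:` is called from the capture queue created in step 2 (`"MyQueue"`), any UI work with the returned image has to hop back to the main queue. A hedged usage sketch, assuming a hypothetical `imageView` outlet not present in the original code:

```objc
// Sketch: the sample buffer delegate below runs on the background
// dispatch queue set via setSampleBufferDelegate:queue:, so UIKit
// calls must be dispatched to the main queue. `self.imageView` is
// a hypothetical outlet used for illustration only.
UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
dispatch_async(dispatch_get_main_queue(), ^{
    self.imageView.image = image;
});
```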

        

      // Delegate routine that is called when a sample buffer was written
      - (void)captureOutput:(AVCaptureOutput *)captureOutput
        didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
               fromConnection:(AVCaptureConnection *)connection {
      // Create a UIImage from the sample buffer data
      NSLog(@"got a frame");
      UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

      // [self.delegate cameraCaptureGotFrame:image];
      }

      Now the delegate `captureOutput:didOutputSampleBuffer:fromConnection:` is never getting called. I don't know where I am going wrong. Any help would be appreciated. Thanks in advance.

0 Answers:

No answers yet