How to change image quality when using AVCaptureSessionPreset

Date: 2013-09-25 10:20:04

Tags: ios objective-c image avfoundation avcapturesession

I'm looking for some help with AVFoundation. I've followed the step-by-step guide on capturing a still image from this topic: How to save photos taken using AVFoundation to Photo Album?

My question is: how can I lower the quality of the saved image while still using AVCaptureSessionPreset640x480? I'd like to halve the image quality, or, if there's another way to make the saved file as small as possible - perhaps 320x280 - that might be better than adjusting the quality itself.

I don't know whether anyone else has asked this before, but I've been searching the web for the past few days and can't find an answer. Below is my code.

```

#import "ViewController.h"
#import <ImageIO/ImageIO.h>

@interface ViewController ()

@end

@implementation ViewController

@synthesize imagePreview;
@synthesize iImage;
@synthesize stillImageOutput;

-(IBAction)captureNow {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] )
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection)
        {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
     {
         CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
         if (exifAttachments)
         {
             // Do something with the attachments.
             NSLog(@"attachments: %@", exifAttachments);
         } else {
             NSLog(@"no attachments");
         }

         NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
         //NSData *data2 = UIImageJPEGRepresentation(image, 0.5f);
         UIImage *image = [[UIImage alloc] initWithData:imageData];

         self.iImage.image = image;

         UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
     }];
}

-(void)viewDidAppear:(BOOL)animated
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPreset640x480;

    CALayer *viewLayer = self.imagePreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

    captureVideoPreviewLayer.frame = self.imagePreview.bounds;
    [self.imagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    [session startRunning];
}


- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end

```
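One way to shrink the saved file regardless of the session preset is to scale the captured `UIImage` down and re-encode it at a lower JPEG quality before saving. A minimal sketch (the helper name is illustrative, and the 320x280 target size is the one mentioned in the question):

```objective-c
// Scale a captured UIImage down and re-encode it as a smaller JPEG.
// Call this with the `image` produced in the capture completion handler.
- (NSData *)compressedDataFromImage:(UIImage *)image {
    CGSize targetSize = CGSizeMake(320.0, 280.0); // target size from the question
    UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0);
    [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // 0.5 compression quality trades visible detail for a smaller file.
    return UIImageJPEGRepresentation(scaled, 0.5f);
}
```

Note that drawing into a fixed 320x280 rect ignores the source aspect ratio (640x480 is 4:3), so the result will be slightly stretched unless you letterbox it.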

2 answers:

Answer 0: (score: 3)

Try using different presets; they give you images of different resolutions.

According to Apple's documentation, for the iPhone 4 (back camera) you will get images at the following resolutions for these session presets:

AVCaptureSessionPresetHigh : 1280x720

AVCaptureSessionPresetMedium : 480x360

AVCaptureSessionPresetLow : 192x144

AVCaptureSessionPreset640x480 : 640x480

AVCaptureSessionPreset1280x720 : 1280x720

AVCaptureSessionPresetPhoto : 2592x1936. This preset is not supported for video output.

Hope this helps.
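Switching presets is just an assignment on the session, but not every device supports every preset, so it's worth checking first. A sketch of how that could look in the question's `viewDidAppear:`, falling back through progressively larger presets:

```objective-c
// Pick the smallest supported preset before starting the session.
if ([session canSetSessionPreset:AVCaptureSessionPresetLow]) {
    session.sessionPreset = AVCaptureSessionPresetLow;      // 192x144 on iPhone 4
} else if ([session canSetSessionPreset:AVCaptureSessionPresetMedium]) {
    session.sessionPreset = AVCaptureSessionPresetMedium;   // 480x360 on iPhone 4
} else {
    session.sessionPreset = AVCaptureSessionPreset640x480;  // the question's preset
}
```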

Answer 1: (score: 2)

You need to set the kCVPixelBufferWidthKey and kCVPixelBufferHeightKey options on the AVCaptureStillImageOutput object to set a resolution of your choosing. This width/height will override the session preset's width/height. A minimal sample follows (add error checking).

    _session = [[AVCaptureSession alloc] init];
    _device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError * error;
    _sessionInput = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
    _stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithDouble:320.0], (id)kCVPixelBufferWidthKey,
                              [NSNumber numberWithDouble:280.0], (id)kCVPixelBufferHeightKey,
                              [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
                              nil];

    [_stillImageOutput setOutputSettings:options];

    [_session beginConfiguration ];
    [_session addInput:_sessionInput];
    [_session addOutput:_stillImageOutput];
    [_session setSessionPreset:AVCaptureSessionPresetPhoto];
     _avConnection = [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [ _session commitConfiguration ];

.............

- (void) start
{
    [self.session startRunning];
}

.............

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:self.avConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
 {
     CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
     CVPixelBufferLockBaseAddress(imageBuffer, 0);
     size_t width = CVPixelBufferGetWidth(imageBuffer);
     size_t height = CVPixelBufferGetHeight(imageBuffer);
     NSLog(@"%zu : %zu", height, width); // %zu is the correct specifier for size_t
     CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

 }];

Note: I have only tried this on a Mac. Ideally it should work on iOS as well. Also try to maintain some aspect ratio.
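One caveat with this approach: because the output settings request kCVPixelFormatType_32BGRA, the sample buffer now contains raw pixels rather than JPEG data, so `jpegStillImageNSDataRepresentation:` no longer applies. The buffer can be converted to a `UIImage` along these lines (a sketch using the standard Core Graphics BGRA setup):

```objective-c
// Convert a BGRA CVPixelBuffer from the completion handler into a UIImage.
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(
    CVPixelBufferGetBaseAddress(imageBuffer),
    CVPixelBufferGetWidth(imageBuffer),
    CVPixelBufferGetHeight(imageBuffer),
    8,                                        // bits per component
    CVPixelBufferGetBytesPerRow(imageBuffer),
    colorSpace,
    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); // BGRA layout

CGImageRef cgImage = CGBitmapContextCreateImage(context);
UIImage *image = [UIImage imageWithCGImage:cgImage];

CGImageRelease(cgImage);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
```

From there the image can be saved with `UIImageWriteToSavedPhotosAlbum` or re-encoded with `UIImageJPEGRepresentation` as in the question.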