How can I take a photo with the iPhone camera using the AVFoundation framework?

Date: 2014-03-15 20:34:59

Tags: ios objective-c avfoundation avcapturesession ios-camera

I need the iOS camera to take a picture without any user input. How can I do that? This is my code so far:

-(void)initCapture{
    // Grab the default video capture device (the back camera) and wrap it in an input.
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *newVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

    // Configure a still image output that delivers JPEG data.
    AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];

    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                AVVideoCodecJPEG, AVVideoCodecKey,
                                nil];

    [newStillImageOutput setOutputSettings:outputSettings];

    // Wire the input and output into a capture session.
    AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];

    if ([newCaptureSession canAddInput:newVideoInput]) {
        [newCaptureSession addInput:newVideoInput];
    }
    if ([newCaptureSession canAddOutput:newStillImageOutput]) {
        [newCaptureSession addOutput:newStillImageOutput];
    }
    self.stillImageOutput = newStillImageOutput;
}

What else do I need to add, and where do I go from here? I don't want to record video, only capture a single still image. Also, how do I convert the captured image into a UIImage afterwards? Thanks.

1 Answer:

Answer 0 (score: 0)

According to this tutorial, you should be able to do it like this. Note that jpegStillImageNSDataRepresentation: is a class method that takes the CMSampleBufferRef delivered by captureStillImageAsynchronouslyFromConnection:completionHandler:, not the output object itself:

// Inside the captureStillImageAsynchronouslyFromConnection:completionHandler: block,
// where the captured frame arrives as a CMSampleBufferRef named imageSampleBuffer:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];

And there's your UIImage object!
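
For completeness, here is a minimal sketch of the whole flow with no user input, placed inside a view controller's @implementation. It assumes ARC, and that the session and output are kept in properties named captureSession and stillImageOutput (the captureSession property name is an assumption for illustration; the session must be retained and started, or no frames will ever be captured):

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

- (void)setupAndCapture {
    // Build the session as in the question, but keep it in a property and start it.
    self.captureSession = [[AVCaptureSession alloc] init];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input && [self.captureSession canAddInput:input]) {
        [self.captureSession addInput:input];
    }

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    self.stillImageOutput.outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
    if ([self.captureSession canAddOutput:self.stillImageOutput]) {
        [self.captureSession addOutput:self.stillImageOutput];
    }

    [self.captureSession startRunning];
}

- (void)takePicture {
    // Find the video connection that feeds the still image output.
    AVCaptureConnection *videoConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    if (!videoConnection) {
        return;
    }

    // Capture one frame asynchronously; the JPEG data arrives as a CMSampleBufferRef.
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                       completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (imageSampleBuffer == NULL) {
            return;
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        // Use the UIImage here, e.g. display it or write it to the photo library.
    }];
}

Call setupAndCapture once (for example in viewDidLoad), give the session a moment to start delivering frames, and then call takePicture whenever a photo is needed.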