I'm trying to take a still photo on the iPhone using the new AVFoundation framework.

When a button is pressed, this method is called. I can hear the shutter sound, but I never see the log output. If I call this method several times, the camera preview freezes.

Is there a tutorial anywhere on how to use captureStillImageAsynchronouslyFromConnection?
[[self stillImageOutput]
    captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput].connections objectAtIndex:0]
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        NSLog(@"inside");
    }];
- (void)initCapture {
    AVCaptureDeviceInput *captureInput =
        [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                                              error:nil];

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;

    dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    self.captureSession = [[AVCaptureSession alloc] init];
    self.captureSession.sessionPreset = AVCaptureSessionPresetLow;
    [self.captureSession addInput:captureInput];
    [self.captureSession addOutput:captureOutput];

    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    [self.prevLayer setOrientation:AVCaptureVideoOrientationLandscapeLeft];
    self.prevLayer.frame = CGRectMake(0.0, 0.0, 480.0, 320.0);
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.prevLayer];

    // Setup the default file outputs
    AVCaptureStillImageOutput *_stillImageOutput = [[[AVCaptureStillImageOutput alloc] init] autorelease];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                       AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [_stillImageOutput setOutputSettings:outputSettings];
    [outputSettings release];
    [self setStillImageOutput:_stillImageOutput];

    if ([self.captureSession canAddOutput:stillImageOutput]) {
        [self.captureSession addOutput:stillImageOutput];
    }

    [self.captureSession commitConfiguration];
    [self.captureSession startRunning];
}
Answer 0 (score: 62)
After a lot of trial and error, I worked out how to do this.

Hint: Apple's official docs are, simply, wrong. The code they give you doesn't actually work.

I wrote step-by-step instructions here:

There's lots of code at that link, but in summary:
-(void) viewDidAppear:(BOOL)animated
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    CALayer *viewLayer = self.vImagePreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
    [self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    [session startRunning];
}
-(IBAction) captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else
            NSLog(@"no attachments");

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];

        self.vImage.image = image;
    }];
}
Answer 1 (score: 16)
We had this problem back when 4.0 was still in beta. I tried a whole bunch of things. Here goes:

I ended up just capturing video frames. The "take picture" button simply sets a flag; in the video-frame callback, if the flag is set, the video frame is returned as a UIImage* instead. This was sufficient for our image-processing needs; "take picture" exists largely so the user can get a negative response (and an option to submit a bug report), and we don't actually want 2/3/5-megapixel images, since they take ages to process.
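The flag-based workaround above can be sketched roughly as follows. This is a minimal sketch in the style of the era's APIs, not the answerer's actual code: takePictureFlag and handleCapturedImage: are hypothetical names, and it assumes the video data output was configured for kCVPixelFormatType_32BGRA, as in the question's initCapture.

```objectivec
// AVCaptureVideoDataOutputSampleBufferDelegate callback (runs on the camera queue).
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (!self.takePictureFlag) return;   // hypothetical flag set by the "take picture" button
    self.takePictureFlag = NO;

    // Convert the BGRA pixel buffer into a UIImage.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Hand the frame to the UI on the main thread; handleCapturedImage: is hypothetical.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self handleCapturedImage:image];
    });
}
```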
If video frames aren't good enough for you (i.e. you want to capture viewfinder frames in between high-res image captures), I'd first check whether they've fixed it using multiple AVCapture sessions, since that's the only way you can set both presets.

It's probably worth filing a bug. I filed one around the release of 4.0 GM; Apple asked me for some sample code, but by then I'd decided to use the video-frame workaround and ship a release.

Additionally, the "low" preset is very low-res (and results in a low-resolution, low-framerate video preview). I'd go for 640x480 if available, falling back to medium otherwise.
Answer 2 (score: 6)
This has been a huge help; I was stuck in the weeds for quite a while trying to follow the AVCam example.

Here is a complete working project with my comments explaining what's going on. It illustrates how to use a capture manager with multiple outputs. In this example there are two outputs.

The first is the still image output from the example above.

The second provides frame-by-frame access to the video coming off the camera. You can add more code to do something interesting with the frames if you like. In this example I'm just updating a frame counter on screen from within the delegate callback.
Answer 3 (score: 3)

Apple has some notes and example code:
Technical Q&A QA1702: How to capture video frames from the camera as images using AV Foundation
Answer 4 (score: 0)

You should use Adam's answer, but if you're using Swift (as most of you probably are these days), here's a Swift 1.2 port of his code:
Make sure you import ImageIO

Add a property:

private var stillImageOutput: AVCaptureStillImageOutput!

Instantiate stillImageOutput before captureSession.startRunning(), like this:
stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
captureSession.addOutput(stillImageOutput)
然后使用此代码捕获图像:
private func captureImage() {
    var videoConnection: AVCaptureConnection?
    for connection in stillImageOutput.connections as! [AVCaptureConnection] {
        for port in connection.inputPorts {
            if port.mediaType == AVMediaTypeVideo {
                videoConnection = connection
                break
            }
        }
        if videoConnection != nil {
            break
        }
    }

    print("about to request a capture from: \(stillImageOutput)")
    stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageSampleBuffer: CMSampleBuffer!, error: NSError!) -> Void in
        let exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, nil)
        if let attachments = exifAttachments {
            // Do something with the attachments
            print("attachments: \(attachments)")
        } else {
            print("no attachments")
        }

        let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
        let image = UIImage(data: imageData)
        // Do something with the image
    }
}
This all assumes that you already have an AVCaptureSession set up and just need to get a still from it, as I did.