I have a barcode scanner written using some of the new AVCapture
APIs in iOS 7. Everything works fine, but once I get the metadata from the capture output I would also like to grab an image. The method below is the delegate where I look up the SKU and so on, and it's also where I'd like to grab the image. Is that possible from this method?
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
...
}
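For reference, the SKU lookup in that delegate reads the decoded value from each AVMetadataMachineReadableCodeObject, roughly along these lines (a sketch only; the sku name is just illustrative):
for (AVMetadataObject *metadataObject in metadataObjects) {
    if ([metadataObject isKindOfClass:[AVMetadataMachineReadableCodeObject class]]) {
        // stringValue holds the decoded barcode payload (e.g. the SKU).
        NSString *sku = [(AVMetadataMachineReadableCodeObject *)metadataObject stringValue];
        NSLog(@"Scanned code: %@", sku);
    }
}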
Answer 0 (score: 3)
To answer your question specifically: no, there is no way to save a picture from an AVCaptureMetadataOutput
instance.
However, as codingVoldemort's excellent example shows, you can create an AVCaptureStillImageOutput
instance and add it to your AVCaptureSession's
outputs. As soon as your app detects some metadata, you can trigger a capture on that still image output.
Here is a more explicit solution, using codingVoldemort's initial code as a starting point.
First, wherever you set up your AVCaptureSession
, add an AVCaptureStillImageOutput
to it:
_session = [[AVCaptureSession alloc] init];
_output = [[AVCaptureMetadataOutput alloc] init];
[_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[_session addOutput:_output];
_stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[_session addOutput:_stillImageOutput];
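For context, the snippet above only wires up the two outputs; a minimal sketch of the rest of the setup (camera input, the barcode types you want, starting the session) might look like this. The _device and _input ivar names and the choice of barcode types are assumptions for illustration:
// Assumed ivars, not in the original: AVCaptureDevice *_device; AVCaptureDeviceInput *_input;
NSError *inputError = nil;
_device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
_input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&inputError];
if (_input) {
    [_session addInput:_input];
}

// Restrict the metadata output to the symbologies you care about.
// This must be set after the output has been added to the session.
_output.metadataObjectTypes = @[AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeQRCode];

[_session startRunning];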
Now, in - captureOutput:didOutputMetadataObjects:fromConnection:
, you can capture a still image when that method fires:
AVCaptureConnection *stillImageConnection = [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
[stillImageConnection setVideoScaleAndCropFactor:1.0f];

// Ask for JPEG output at full quality.
_stillImageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG, AVVideoQualityKey: @1};

[_stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (error) {
        NSLog(@"error: %@", error);
    }
    else {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // Grabbing the image here.
        dispatch_async(dispatch_get_main_queue(), ^(void) {
            // Update UI if necessary.
        });
    }
}];
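One practical caveat: - captureOutput:didOutputMetadataObjects:fromConnection: can fire many times per second while a barcode stays in view, so you will usually want to guard against kicking off overlapping still captures. A minimal sketch, assuming a hypothetical _isCapturing BOOL ivar that is not part of the original code:
// Hypothetical ivar: BOOL _isCapturing;
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    if (_isCapturing || metadataObjects.count == 0) {
        return; // Nothing detected, or a still capture is already in flight.
    }
    _isCapturing = YES;

    AVCaptureConnection *stillImageConnection = [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [_stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        // ... handle the JPEG data as in the snippet above ...
        _isCapturing = NO;
    }];
}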
Answer 1 (score: 2)
Try this method:
- (void)captureZoomedImage:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Find out the current orientation and tell the still image output.
    AVCaptureConnection *stillImageConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    UIDeviceOrientation curDeviceOrientation = [[UIDevice currentDevice] orientation];
    AVCaptureVideoOrientation avcaptureOrientation = [self avOrientationForDeviceOrientation:curDeviceOrientation];
    [stillImageConnection setVideoOrientation:avcaptureOrientation];
    [stillImageConnection setVideoScaleAndCropFactor:1.0f];

    // Ask for JPEG output at full quality.
    stillImageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG, AVVideoQualityKey: @1};

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                   completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (error) {
            [self displayErrorOnMainQueue:error withMessage:@"Take picture failed"];
        }
        else {
            NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:jpegData];
            // Grabbing the image here.
            dispatch_async(dispatch_get_main_queue(), ^(void) {
                // Update UI if necessary.
            });
        }
    }];
}
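Note that this relies on two helpers that are not shown, avOrientationForDeviceOrientation: and displayErrorOnMainQueue:withMessage:. As an assumption of what the orientation helper would look like, a minimal sketch:
- (AVCaptureVideoOrientation)avOrientationForDeviceOrientation:(UIDeviceOrientation)deviceOrientation
{
    switch (deviceOrientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            return AVCaptureVideoOrientationPortraitUpsideDown;
        case UIDeviceOrientationLandscapeLeft:
            // Device and video orientations are mirrored for landscape.
            return AVCaptureVideoOrientationLandscapeRight;
        case UIDeviceOrientationLandscapeRight:
            return AVCaptureVideoOrientationLandscapeLeft;
        default:
            return AVCaptureVideoOrientationPortrait;
    }
}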
Answer 2 (score: 0)
I wanted to translate Tim's answer into Swift =) Here's part 1:
let session = AVCaptureSession()
var metadataOutput = AVCaptureMetadataOutput()
var stillCameraOutput = AVCaptureStillImageOutput()
// Deliver metadata callbacks on the main queue, matching the Objective-C version.
let sessionQueue = dispatch_get_main_queue()

metadataOutput.setMetadataObjectsDelegate(self, queue: sessionQueue)
if session.canAddOutput(metadataOutput) {
    session.addOutput(metadataOutput)
}
session.addOutput(stillCameraOutput)
And here's part 2:
var image = UIImage()
let stillImageConnection = stillCameraOutput.connectionWithMediaType(AVMediaTypeVideo)
stillImageConnection.videoOrientation = .Portrait
stillImageConnection.videoScaleAndCropFactor = 1.0

stillCameraOutput.captureStillImageAsynchronouslyFromConnection(stillImageConnection, completionHandler: { (imageDataSampleBuffer, error) in
    if (error != nil) {
        print("There was an error capturing the image")
    } else {
        let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
        image = UIImage(data: jpegData)!
    }
})