I have run into a problem; I found similar questions with answers, but none of them worked for me.
Goal: after 60 seconds of trying to detect 3 QR codes, offer the option to manually capture an image and send that image by email. Something like: 60 seconds pass -> a UIAlertController with an "OK" button that presents an image-capture view -> the user captures an image, which is then attached to an email they can send.
Problem: I cannot find a way to capture an image using the OpenCV setup described in https://www.toptal.com/machine-learning/real-time-object-detection-using-mser-in-ios.
Things I have tried:
Taking a screenshot while the video is playing (this works, but I am looking for a more sophisticated way to do it):
- (UIImage *)capture {
    // Render the current view hierarchy into an image context and return the snapshot.
    UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}
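If the screenshot route is kept, a small refinement (a sketch, not from the original post) is to render at the device's screen scale so the snapshot is not limited to 1x:

- (UIImage *)captureAtScreenScale {
    // A scale of 0.0 uses the main screen's scale (2x/3x) instead of 1x.
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0.0);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}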
Using UIImagePickerController (it fails to present the MFMailComposeViewController and keeps logging: "Snapshotting a view that has not been rendered results in an empty snapshot. Ensure your view has been rendered at least once before snapshotting or snapshot after screen updates."):
- (IBAction)takePhoto:(id)sender {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.allowsEditing = NO;
    picker.modalPresentationStyle = UIModalPresentationCurrentContext;
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    // Note: picker.delegate is never set here; the answer below identifies this as the root cause.
    [self presentViewController:picker animated:YES completion:nil];
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    [self dismissViewControllerAnimated:YES completion:nil];
    [self emailImage:image];
}
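For reference, a minimal sketch of the protocol conformances the presenting class needs for the picker and mail-composer delegates (the class name here is illustrative, not from the question):

#import <UIKit/UIKit.h>
#import <MessageUI/MessageUI.h>

@interface ScanViewController : UIViewController
    <UIImagePickerControllerDelegate,
     UINavigationControllerDelegate,
     MFMailComposeViewControllerDelegate>
@end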
- (void)emailImage:(UIImage *)image {
    if ([MFMailComposeViewController canSendMail]) {
        MFMailComposeViewController *mailComposeViewController = [[MFMailComposeViewController alloc] init];
        mailComposeViewController.mailComposeDelegate = self;
        mailComposeViewController.navigationBar.barStyle = UIBarStyleDefault;
        mailComposeViewController.modalPresentationStyle = UIModalPresentationPageSheet;
        [mailComposeViewController setSubject:@"Cannot Scan"];
        [mailComposeViewController setMessageBody:@"Image Attached" isHTML:NO];
        [mailComposeViewController setToRecipients:@[@"x@x.com"]];
        // Attach the captured photo as a PNG.
        NSData *data = UIImagePNGRepresentation(image);
        [mailComposeViewController addAttachmentData:data mimeType:@"image/png" fileName:@"Photo"];
        [self presentViewController:mailComposeViewController animated:YES completion:nil];
    }
}
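canSendMail returns NO when no mail account is configured on the device, so the else branch is worth handling. A sketch (the method name and alert wording are illustrative), called from the else branch of the canSendMail check:

- (void)showMailUnavailableAlert {
    // Shown when MFMailComposeViewController cannot be used
    // (for example, no mail account is configured on the device).
    UIAlertController *alert = [UIAlertController alertControllerWithTitle:@"Mail Unavailable"
                                                                   message:@"Please configure a mail account in Settings and try again."
                                                            preferredStyle:UIAlertControllerStyleAlert];
    [alert addAction:[UIAlertAction actionWithTitle:@"OK" style:UIAlertActionStyleDefault handler:nil]];
    [self presentViewController:alert animated:YES completion:nil];
}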
- (void)mailComposeController:(MFMailComposeViewController *)controller
          didFinishWithResult:(MFMailComposeResult)result
                        error:(NSError *)error {
    NSString *msg1;
    switch (result) {
        case MFMailComposeResultCancelled:
            msg1 = @"Sending Mail is cancelled";
            break;
        case MFMailComposeResultSaved:
            msg1 = @"Sending Mail is Saved";
            break;
        case MFMailComposeResultSent:
            msg1 = @"Your Mail has been sent successfully. We will be responding shortly with your results.";
            break;
        case MFMailComposeResultFailed:
            msg1 = @"Message sending failed";
            break;
        default:
            msg1 = @"Your Mail is not Sent";
            break;
    }
    // UIAlertView is deprecated; kept here as in the original attempt.
    UIAlertView *mailResultAlert = [[UIAlertView alloc] initWithFrame:CGRectMake(10, 170, 300, 120)];
    mailResultAlert.title = @"Message";
    mailResultAlert.message = msg1;
    [mailResultAlert addButtonWithTitle:@"OK"];
    [mailResultAlert show];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Dismiss the mail composer, then the image picker beneath it.
        [self dismissViewControllerAnimated:YES completion:nil];
        [self dismissViewControllerAnimated:YES completion:nil];
    });
}
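Since UIAlertView is deprecated from iOS 8 onward, a sketch of showing the same result with UIAlertController, dismissing the composer first and presenting the alert in the completion block:

[controller dismissViewControllerAnimated:YES completion:^{
    UIAlertController *resultAlert = [UIAlertController alertControllerWithTitle:@"Message"
                                                                         message:msg1
                                                                  preferredStyle:UIAlertControllerStyleAlert];
    [resultAlert addAction:[UIAlertAction actionWithTitle:@"OK"
                                                    style:UIAlertActionStyleDefault
                                                  handler:nil]];
    [self presentViewController:resultAlert animated:YES completion:nil];
}];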
Does anyone know how to manually capture an image using the OpenCV video camera approach? Or how to fix the UIImagePickerController? Or any other alternative solution?
Thanks!
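For the OpenCV part of the question (which the answer below does not cover), one common pattern with the CvVideoCamera setup from the linked article is to keep a copy of the most recent frame inside the processImage: delegate callback and convert it to a UIImage on demand. This is only a sketch under that assumption; the class name, the _lastFrame ivar, and the BGRA color conversion are illustrative and may need adjusting for your camera configuration:

#import <opencv2/opencv.hpp>
#import <opencv2/imgcodecs/ios.h>   // MatToUIImage()

@implementation ScannerViewController {
    cv::Mat _lastFrame;   // illustrative ivar holding a copy of the newest frame
}

// CvVideoCameraDelegate callback; runs on the camera's background queue,
// so access to _lastFrame may need synchronization in a real implementation.
- (void)processImage:(cv::Mat &)image {
    image.copyTo(_lastFrame);   // copy, because the camera reuses the buffer
    // ... existing QR-detection work on `image` ...
}

- (UIImage *)captureCurrentFrame {
    if (_lastFrame.empty()) {
        return nil;
    }
    cv::Mat rgba;
    // Frames are typically delivered as BGRA; adjust if your setup differs.
    cv::cvtColor(_lastFrame, rgba, cv::COLOR_BGRA2RGBA);
    return MatToUIImage(rgba);
}

@end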
Answer 0 (score: 0)
I found a solution using UIImagePickerController.
After 60 seconds without detecting 3 QR codes, a UIAlertController is shown. When the "OK" button is pressed, I perform a segue to another view controller.
- (void)tooLong
{
    if ([self.navigationController.visibleViewController isKindOfClass:[UIAlertController class]]) {
        [self.navigationController.visibleViewController dismissViewControllerAnimated:YES completion:nil];
    }
    [self.contourScanner stopScan];
    self.analysisInProgress = NO;
    dispatch_async(dispatch_get_main_queue(), ^{
        UIAlertController *alert = [UIAlertController alertControllerWithTitle:NSLocalizedString(@"Oops", nil)
                                                                       message:NSLocalizedString(@"It looks like we cannot detect your cup. Press 'OK' to take a photo manually. This image will be emailed to us to analyze ourselves.", nil)
                                                                preferredStyle:UIAlertControllerStyleAlert];
        UIAlertAction *action = [UIAlertAction actionWithTitle:NSLocalizedString(@"OK", nil) style:UIAlertActionStyleDefault handler:^(UIAlertAction * _Nonnull action) {
            [self performSegueWithIdentifier:@"takePhoto" sender:nil];
        }];
        [alert addAction:action];
        [self presentViewController:alert animated:YES completion:nil];
    });
}
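The 60-second timeout itself is not shown in the answer; a sketch of one way to schedule it with NSTimer (the timeoutTimer property name is illustrative), invalidating it as soon as the three QR codes are found:

// Illustrative: start the timeout when scanning begins.
self.timeoutTimer = [NSTimer scheduledTimerWithTimeInterval:60.0
                                                     target:self
                                                   selector:@selector(tooLong)
                                                   userInfo:nil
                                                    repeats:NO];

// Illustrative: cancel it once all three QR codes have been detected.
[self.timeoutTimer invalidate];
self.timeoutTimer = nil;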
Then, in the new view controller's .m file, I created a property (@property (strong, nonatomic) UIImagePickerController *picker;), and added the following to capture the original image correctly.
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    if (!self.hasPhotoCaptured) {
        [self takePhoto];
    }
}

- (void)takePhoto {
    self.hasPhotoCaptured = YES;
    self.picker = [[UIImagePickerController alloc] init];
    self.picker.allowsEditing = NO;
    self.picker.modalPresentationStyle = UIModalPresentationCurrentContext;
    self.picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    self.picker.delegate = self;
    self.picker.cameraFlashMode = UIImagePickerControllerCameraFlashModeOff;
    self.picker.cameraDevice = UIImagePickerControllerCameraDeviceRear;
    [self presentViewController:self.picker animated:YES completion:nil];
}
My original problem was that I never set self.picker.delegate = self;. Once that line was added, the imagePickerController:didFinishPickingMediaWithInfo: method from the original post is called as expected.
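If the user cancels the camera instead of taking a photo, the picker also needs the cancel callback; a sketch (the behaviour here, simply dismissing, is an assumption rather than something from the answer):

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    // Dismiss the camera UI when the user taps Cancel.
    [self dismissViewControllerAnimated:YES completion:nil];
}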