I want to use the two Objective-C methods listed below in my application. The first method uploads a UIImagePicker photo to a local server.
// I would still like to use this method structure but with the `AVCam` classes.
-(void)uploadPhoto {
    // Upload the image and the title to the web service
    // Note: UIImageJPEGRepresentation's compression quality is 0.0-1.0, so 0.7 rather than 70.
    [[API sharedInstance] commandWithParams:[NSMutableDictionary dictionaryWithObjectsAndKeys:@"upload", @"command", UIImageJPEGRepresentation(photo.image, 0.7), @"file", fldTitle.text, @"title", nil] onCompletion:^(NSDictionary *json) {
        // Completion
        if (![json objectForKey:@"error"]) {
            // Success
            [[[UIAlertView alloc] initWithTitle:@"Success!" message:@"Your photo is uploaded" delegate:nil cancelButtonTitle:@"Yay!" otherButtonTitles:nil] show];
        } else {
            // Error; check for an expired session and, if so, authorize the user
            NSString *errorMsg = [json objectForKey:@"error"];
            [UIAlertView error:errorMsg];
            if ([@"Authorization required" compare:errorMsg] == NSOrderedSame) {
                [self performSegueWithIdentifier:@"ShowLogin" sender:nil];
            }
        }
    }];
}
I would like to add the second method. The second method uses an IBAction to perform the AVCam image capture, but I changed it to void so that it can be triggered with [self snapStillImage] when the view loads.

EDIT
- (IBAction)snapStillImage:(id)sender
{
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for Still Capture
        [ViewController5 setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil];

                photo = [[UIImage alloc] initWithData:imageData];
            }
        }];
    });
}
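As an aside (my addition, not part of the AVCam excerpt): the ALAssetsLibrary save above was deprecated in iOS 9 in favor of the Photos framework. A minimal sketch of the equivalent save using PHPhotoLibrary, assuming `image` is the UIImage just created from the capture data:

```objc
#import <Photos/Photos.h>

// Save the captured image via the Photos framework instead of ALAssetsLibrary.
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    [PHAssetChangeRequest creationRequestForAssetFromImage:image];
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Saving photo failed: %@", error);
    }
}];
```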
Can someone show how to set photo with AVCam? At least humor me and start a conversation about AVFoundation and which of its classes are appropriate for solving a problem like this.

Additional info: the AVCam method is just an excerpt from https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html

@Aksh1t I want to set a UIImage named image with the original contents of the AVFoundation capture, not UIImagePicker. Here is how I currently set the outlet using UIImagePicker:
#pragma mark - Image picker delegate methods

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    // Resize the image from the camera
    UIImage *scaledImage = [image resizedImageWithContentMode:UIViewContentModeScaleAspectFill bounds:CGSizeMake(photo.frame.size.width, photo.frame.size.height) interpolationQuality:kCGInterpolationHigh];
    // Crop the image to a square (yikes, fancy!)
    UIImage *croppedImage = [scaledImage croppedImage:CGRectMake((scaledImage.size.width - photo.frame.size.width) / 2, (scaledImage.size.height - photo.frame.size.height) / 2, photo.frame.size.width, photo.frame.size.height)];
    // Show the photo on the screen
    photo.image = croppedImage;
    // dismissModalViewControllerAnimated: is deprecated; use the modern equivalent.
    [picker dismissViewControllerAnimated:NO completion:nil];
}
After that, I just want to upload it using the first method I posted. Sorry for being unclear. Basically, in my new app I want to take the AVCam capture, put it into the UIImageView IBOutlet named photo, and upload it to the server. The basic framework is above, and I will answer any questions.
Answer (score: 1)

The following line of code in the snapStillImage method puts the photo into the imageData variable:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
Next, you create a UIImage object from this data, like so:
UIImage *image = [[UIImage alloc] initWithData:imageData];
Instead of the code above, create a global variable UIImage *photo; and initialize it in snapStillImage when the photo is captured from imageData:
photo = [[UIImage alloc] initWithData:imageData];
Since photo is a global variable, you can use it in your uploadPhoto method to send the image to your server.

Hope this helps; if you have any questions, leave them in the comments.
EDIT:
Since you already have IBOutlet UIImageView *photo; in your file, you don't even need a global variable to store the UIImage. Simply replace the following line in your snapStillImage method:
UIImage *image = [[UIImage alloc] initWithData:imageData];
with this line:
photo.image = [[UIImage alloc] initWithData:imageData];
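Putting the pieces together, the tail of the completion handler could look like the sketch below. Two caveats that are my additions rather than part of the original answer: the completion handler runs on the session queue, so the outlet assignment is dispatched back to the main queue (UIKit is main-thread only), and uploadPhoto is called only after photo.image is set so the upload sees the new image.

```objc
// Inside captureStillImageAsynchronouslyFromConnection:'s completion handler:
if (imageDataSampleBuffer) {
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Set the outlet on the main thread, then kick off the upload.
        photo.image = [[UIImage alloc] initWithData:imageData];
        [self uploadPhoto];
    });
}
```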