I already know how to use UIImagePickerController, but in my project I'm actually using AVFoundation.
Should I use the UIImagePickerController delegate even though I'm using AVFoundation, or is there another way to do this?
For the record, the more customizable the better, since that is why I chose AVFoundation over UIImagePickerController in the first place: being able to change its layout, colors, frame, and so on.
Answer 0 (score: 0)
Apple has some useful documentation that may help you; my guess is that you just need the right keywords to search for.
If you want to support iOS 8 and older iOS versions, what you want to do is get access to some photo in the library as an AVAsset.
The first code listing in that documentation shows how to fetch the first video from the user's albums; to fetch the first photo instead, simply replace the [ALAssetsFilter allVideos] bit with [ALAssetsFilter allPhotos].
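A minimal sketch of that approach with the (now-deprecated) AssetsLibrary framework, assuming <AssetsLibrary/AssetsLibrary.h> is imported and the user has granted photo-library access:

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    // Only look at still photos, as described above
    [group setAssetsFilter:[ALAssetsFilter allPhotos]];
    if (group.numberOfAssets > 0) {
        [group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:0]
                                options:0
                             usingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
            if (asset) {
                // Turn the first photo into a UIImage
                ALAssetRepresentation *rep = [asset defaultRepresentation];
                UIImage *image = [UIImage imageWithCGImage:[rep fullScreenImage]];
                // ...use the image...
                *innerStop = YES;
                *stop = YES;
            }
        }];
    }
} failureBlock:^(NSError *error) {
    NSLog(@"Photo library access denied: %@", error);
}];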
If you only want to support iOS 9 and newer, there's a new Photos framework, and assets are accessed as PHAssets.
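A minimal sketch of the Photos-framework route, assuming @import Photos; and that photo-library authorization has already been granted (it fetches the newest image asset and requests a thumbnail for it):

PHFetchOptions *options = [[PHFetchOptions alloc] init];
options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
PHFetchResult *photos = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:options];
PHAsset *newest = photos.firstObject;
if (newest) {
    [[PHImageManager defaultManager] requestImageForAsset:newest
                                               targetSize:CGSizeMake(300, 300)
                                              contentMode:PHImageContentModeAspectFill
                                                  options:nil
                                            resultHandler:^(UIImage *image, NSDictionary *info) {
        // ...use the image...
    }];
}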
Answer 1 (score: 0)
Any image available to the Photos app can be accessed through the ALAssetsLibrary class. Note that it will not do the work for you: you have to build the UI for picking an image from the photo library yourself, from scratch.
Answer 2 (score: 0)
OK, if you absolutely need to customize the camera's UI, but at the same time you don't want to dig into AVAsset just to bring up the photo library, here is what you do: use AVFoundation for the custom camera and UIImagePickerController for the photo library.
In the custom camera view:
- (instancetype)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // Set up the capture session for still photos
        self.session = [[AVCaptureSession alloc] init];
        [self.session setSessionPreset:AVCaptureSessionPresetPhoto];

        // Attach the default camera as input
        AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error;
        self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
        if ([self.session canAddInput:self.deviceInput]) {
            [self.session addInput:self.deviceInput];
        }

        // Show the live camera feed in this view's layer (square preview)
        AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
        [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        CALayer *rootLayer = [self layer];
        [rootLayer setMasksToBounds:YES];
        CGRect previewFrame = CGRectMake(0, 0, self.frame.size.width, self.frame.size.width);
        [previewLayer setFrame:previewFrame];
        [rootLayer insertSublayer:previewLayer atIndex:0];

        // Configure JPEG still-image output
        self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
        [self.stillImageOutput setOutputSettings:outputSettings];
        [self.session addOutput:self.stillImageOutput];

        [self.session startRunning];
    }
    return self;
}
- (void)takePhoto:(UITapGestureRecognizer *)tap {
    // Slide the container view off-screen as a shutter effect
    [UIView animateWithDuration:0.15
                          delay:0.0
                        options:UIViewAnimationOptionCurveEaseOut
                     animations:^{
                         self.containerView.frame = CGRectMake(-self.frame.size.width, self.containerView.frame.origin.y, self.containerView.frame.size.width, self.containerView.frame.size.height);
                     } completion:^(BOOL finished){
                         if (finished) {
                         }
                     }];

    // Find the video connection on the still-image output
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
    }

    // Capture a still frame, convert it to a UIImage, and hand it off
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                       completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
            NSLog(@"1. newSize.width : %f, newSize.height : %f", image.size.width, image.size.height);
            [self.session stopRunning];
            [self shouldUploadImage:image];
        }
    }];
}
- (void)photoAlbumTapped {
    NSLog(@"photo album button tapped");
    [self.delegate callImagePickerDelegate:self];
}
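For the delegate call above to compile, the camera view needs a delegate protocol and a few properties along these lines; the exact declaration is not part of the answer, so the names below are an assumption that simply matches the calls used above:

@class CameraView;

@protocol CameraViewDelegate <NSObject>
// Implemented by the view controller; asks it to present the photo picker
- (void)callImagePickerDelegate:(CameraView *)sender;
@end

@interface CameraView : UIView
@property (nonatomic, weak) id<CameraViewDelegate> delegate;
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureDeviceInput *deviceInput;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic, strong) UIView *containerView; // assumed: the view animated in takePhoto:
@end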
In the view controller:
- (void)callImagePickerDelegate:(CameraView *)sender {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.delegate = self;
    picker.allowsEditing = YES;
    picker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    [self presentViewController:picker animated:YES completion:NULL];
}
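Because the view controller is the picker's delegate, it also needs to adopt UIImagePickerControllerDelegate and UINavigationControllerDelegate and handle the result; a minimal sketch (what you do with the picked image is up to you):

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // allowsEditing is YES above, so prefer the edited image and fall back to the original
    UIImage *image = info[UIImagePickerControllerEditedImage] ?: info[UIImagePickerControllerOriginalImage];
    // ...pass the image along, e.g. to the same upload path the custom camera uses...
    [picker dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [picker dismissViewControllerAnimated:YES completion:nil];
}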