I'm building an iOS app with a full-screen UIImagePickerController, where the captured image should match exactly what is shown in the picker's preview. Here is my relevant code so far:
Creating and transforming the UIImagePickerController:
self.imagePicker = [[UIImagePickerController alloc] init];
self.imagePicker.delegate = self;

CGSize screenSize = [[UIScreen mainScreen] bounds].size;
// set the aspect ratio of the camera
float heightRatio = 4.0f / 3.0f;
// calculate the height of the camera preview based on the screen width
float cameraHeight = floorf(screenSize.width * heightRatio);
// calculate the ratio that the camera preview needs to be scaled by
float scale = ceilf((screenSize.height / cameraHeight) * 10.0) / 10.0;

if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    self.imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
    self.imagePicker.showsCameraControls = NO;
    [self.imagePicker setCameraOverlayView:cameraView];
    // move the preview to the vertical center of the screen
    self.imagePicker.cameraViewTransform = CGAffineTransformMakeTranslation(0, (screenSize.height - cameraHeight) / 2.0);
    // concatenate the scale transform
    self.imagePicker.cameraViewTransform = CGAffineTransformScale(self.imagePicker.cameraViewTransform, scale, scale);
}
And here is the code I use after capture to redraw the captured image so it matches the preview:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    self.image = [info objectForKey:UIImagePickerControllerOriginalImage];
    self.image = [self croppedImage:self.image];
}
- (UIImage *)croppedImage:(UIImage *)image {
    CGSize screenSize = [[UIScreen mainScreen] bounds].size;
    // set the aspect ratio of the camera
    float heightRatio = 4.0f / 3.0f;
    // calculate the height of the camera preview based on the screen width
    float cameraHeight = floorf(screenSize.width * heightRatio);
    // calculate the ratio that the camera preview needs to be scaled by
    float scale = ceilf((screenSize.height / cameraHeight) * 10.0) / 10.0;
    CGSize originalImageSize = [image size];
    CGSize newImageSize = CGSizeMake(floorf(originalImageSize.width / scale) * 3/4, floorf(originalImageSize.height / scale) * 3/4);
    CGRect newImageRect = CGRectMake((originalImageSize.width - newImageSize.width) / 2.0, (originalImageSize.height - newImageSize.height) / 2.0, newImageSize.width, newImageSize.height);
    // croppedImage: is a UIImage category method (not part of UIKit)
    return [image croppedImage:newImageRect];
}
So my problem is that the calculation in my croppedImage: method is off: the resulting image comes out more "zoomed in" than it should be. I can't work out what is wrong in the math.
Note: this app is meant to scale correctly on all iPhones, portrait mode only. I am currently testing on an iPhone 6.
Answer 0 (score: 0)
In case this helps anyone: on my device I was able to fix it by changing

CGSize newImageSize = CGSizeMake(floorf(originalImageSize.width / scale) * 3/4, floorf(originalImageSize.height / scale) * 3/4);

to

CGSize newImageSize = CGSizeMake(floorf(originalImageSize.width / scale), floorf(originalImageSize.height / scale) * 4/3);

i.e. the height/scale needed to be multiplied by 4/3 rather than 3/4, and the width should not be multiplied at all. I haven't tested this on any other device; just posting it in case anyone runs into the same thing.