I am using code similar to the one in this blog post, http://blog.logichigh.com/2008/06/05/uiimage-fix/, to rotate images after they have been taken with the iPhone camera. I am using AVFoundation.
I have extracted the relevant code here:
case UIImageOrientationUp: //EXIF = 1
    transform = CGAffineTransformIdentity;
    break;

case UIImageOrientationUpMirrored: //EXIF = 2
    transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
    transform = CGAffineTransformScale(transform, -1.0, 1.0);
    break;

case UIImageOrientationDown: //EXIF = 3
    transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
    transform = CGAffineTransformRotate(transform, M_PI);
    break;

case UIImageOrientationDownMirrored: //EXIF = 4
    transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
    transform = CGAffineTransformScale(transform, 1.0, -1.0);
    break;

case UIImageOrientationLeftMirrored: //EXIF = 5
    boundHeight = bounds.size.height;
    bounds.size.height = bounds.size.width;
    bounds.size.width = boundHeight;
    transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
    transform = CGAffineTransformScale(transform, -1.0, 1.0);
    transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
    break;

case UIImageOrientationLeft: //EXIF = 6
    boundHeight = bounds.size.height;
    bounds.size.height = bounds.size.width;
    bounds.size.width = boundHeight;
    transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
    transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
    break;

case UIImageOrientationRightMirrored: //EXIF = 7
    boundHeight = bounds.size.height;
    bounds.size.height = bounds.size.width;
    bounds.size.width = boundHeight;
    transform = CGAffineTransformMakeScale(-1.0, 1.0);
    transform = CGAffineTransformRotate(transform, M_PI / 2.0);
    break;

case UIImageOrientationRight: //EXIF = 8
    boundHeight = bounds.size.height;
    bounds.size.height = bounds.size.width;
    bounds.size.width = boundHeight;
    transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
    transform = CGAffineTransformRotate(transform, M_PI / 2.0);
    break;
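For context, this switch only builds the transform; the corrected image is then produced by concatenating the transform onto a bitmap context and drawing the original CGImage through it. Below is a simplified sketch of how that is done in the style of the linked post (the function name, the scale factor of 1.0, and the coordinate-flip handling are my own simplification, not copied verbatim):

#import <UIKit/UIKit.h>

// Simplified sketch: apply the orientation transform built above by drawing
// the source CGImage into a new bitmap context of the corrected size.
// `transform` and `bounds` are assumed to come from the switch statement.
static UIImage *OrientationCorrectedImage(CGImageRef imgRef,
                                          UIImageOrientation orient,
                                          CGAffineTransform transform,
                                          CGRect bounds) {
    CGFloat width  = CGImageGetWidth(imgRef);
    CGFloat height = CGImageGetHeight(imgRef);

    UIGraphicsBeginImageContext(bounds.size);
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Core Graphics draws with a flipped Y axis relative to UIKit, so the
    // context is flipped before concatenating the orientation transform.
    if (orient == UIImageOrientationRight || orient == UIImageOrientationLeft) {
        CGContextScaleCTM(context, -1.0, 1.0);
        CGContextTranslateCTM(context, -height, 0.0);
    } else {
        CGContextScaleCTM(context, 1.0, -1.0);
        CGContextTranslateCTM(context, 0.0, -height);
    }
    CGContextConcatCTM(context, transform);

    CGContextDrawImage(context, CGRectMake(0.0, 0.0, width, height), imgRef);
    UIImage *corrected = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return corrected;
}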
This works fine when the phone is oriented along the X or Y axis. However, when I lay the phone flat, along the Z axis, the resulting UIImage always comes back with EXIF = 2.
I know I can use the accelerometer to tell when the device is lying flat on the Z axis. However, I can't see a way to distinguish the images at the time they are taken and tag them accordingly, since they still come back with EXIF = 2.
That is, the accelerometer would let me tell that a photo was taken with the phone flat, but it would not let me tell the photos themselves apart, e.g. Landscape1 (iPhone home button on the left), Portrait, Landscape2 (iPhone home button on the right).
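For reference, detecting the flat case itself is straightforward, since UIKit reports face up / face down as distinct device orientations. A minimal sketch (assuming orientation notifications have been started elsewhere; the function name is just illustrative):

#import <UIKit/UIKit.h>

// Sketch: returns YES when the device is lying flat (face up or face down),
// i.e. the "Z axis" case that the EXIF orientation cannot express.
// Assumes -beginGeneratingDeviceOrientationNotifications has already been
// called; otherwise the reading may be UIDeviceOrientationUnknown.
static BOOL DeviceIsLyingFlat(void) {
    UIDeviceOrientation orientation = [[UIDevice currentDevice] orientation];
    return UIDeviceOrientationIsFlat(orientation);
}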
Answer 0 (score: 0)
EXIF data only reports X-Y orientation. There is nothing in the EXIF data that tells you whether the camera was facing up or down. You can grab the device orientation at the moment you capture the image:
[[UIDevice currentDevice] orientation]
Then you just need to keep track of each image and its up/down orientation, separately from the EXIF data. If you are storing the images in your app, you could use a simple database table, or even a serialized NSDictionary with the image name as the key and the up/down orientation as the value.
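Something along these lines, for example (a rough sketch; the plist name, the function, and storing the raw UIDeviceOrientation value are just placeholder choices):

#import <UIKit/UIKit.h>

// Sketch: record the device orientation alongside each captured image,
// keyed by image name, and persist the mapping as a plist. The file name
// and function are placeholders; a database table would work just as well.
static void RecordOrientationForImage(NSString *imageName) {
    NSString *path = [NSTemporaryDirectory()
        stringByAppendingPathComponent:@"ImageOrientations.plist"];

    NSDictionary *existing = [NSDictionary dictionaryWithContentsOfFile:path];
    NSMutableDictionary *orientations = existing
        ? [existing mutableCopy]
        : [NSMutableDictionary dictionary];

    // Captured at shutter time, independently of whatever EXIF reports.
    UIDeviceOrientation deviceOrientation = [[UIDevice currentDevice] orientation];
    orientations[imageName] = @(deviceOrientation);

    [orientations writeToFile:path atomically:YES];
}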
Answer 1 (score: 0)
I ran into a similar problem. I was positioning things on screen based on the device orientation, but if the device is lying flat, without being obviously in landscape or portrait, [[UIDevice currentDevice] orientation] is unreliable. I ended up using [[UIApplication sharedApplication] statusBarOrientation] to get around this, since statusBarOrientation always returns the orientation in which the status bar is currently displayed on screen. Most of my experience is with iOS apps written in Xamarin and C#, so forgive me if my Objective-C is slightly off.
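In rough Objective-C that would look something like this (a sketch; the function name and logging are illustrative):

#import <UIKit/UIKit.h>

// Sketch: read the status bar orientation, which stays meaningful even when
// the device is lying flat and [[UIDevice currentDevice] orientation]
// reports FaceUp/FaceDown.
static void LogCurrentUIOrientation(void) {
    UIInterfaceOrientation orientation =
        [[UIApplication sharedApplication] statusBarOrientation];

    switch (orientation) {
        case UIInterfaceOrientationPortrait:
        case UIInterfaceOrientationPortraitUpsideDown:
            NSLog(@"UI is portrait");
            break;
        case UIInterfaceOrientationLandscapeLeft:
        case UIInterfaceOrientationLandscapeRight:
            NSLog(@"UI is landscape");
            break;
        default:
            NSLog(@"Orientation unknown");
            break;
    }
}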