Reading GPS data from an image returned by the camera on an iOS iPhone

Date: 2012-04-24 16:42:31

Tags: ios geolocation gps camera

I need to get the GPS coordinates of an image taken with the iOS device's camera. I don't care about camera roll images, only images taken with UIImagePickerControllerSourceTypeCamera.

I've read many stackoverflow answers, such as Get Exif data from UIImage - UIImagePickerController, which either assume you are using the AssetsLibrary framework (which doesn't seem to work for camera images) or use CoreLocation to get the latitude/longitude from the app itself rather than from the image.

Using CoreLocation is not an option. It won't give me the coordinates at the moment the shutter button is pressed. (With CoreLocation-based solutions, you have to record the coordinates before or after bringing up the camera view, and of course if the device is moving the coordinates will be wrong. This method should work with a stationary device.)

I'm iOS 5 only, so I don't need to support older devices. This is also for a commercial product, so I can't use http://code.google.com/p/iphone-exif/.

So, what are my options for reading GPS data from the image returned by the camera in iOS 5? All I can think of right now is to save the image to the camera roll and then use the AssetsLibrary, but that seems hacky.

Thanks!


Here is the code I wrote based on Caleb's answer.

    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    NSData *jpeg = UIImageJPEGRepresentation(image, 1.0);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);

    NSDictionary *metadataNew = (__bridge NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);

    NSLog(@"%@", metadataNew);

My console shows:

    2012-04-26 14:15:37:137 ferret[2060:1799] {
        ColorModel = RGB;
        Depth = 8;
        Orientation = 6;
        PixelHeight = 1936;
        PixelWidth = 2592;
        "{Exif}" =     {
            ColorSpace = 1;
            PixelXDimension = 2592;
            PixelYDimension = 1936;
        };
        "{JFIF}" =     {
            DensityUnit = 0;
            JFIFVersion =         (
                1,
                1
            );
            XDensity = 1;
            YDensity = 1;
        };
        "{TIFF}" =     {
            Orientation = 6;
        };
    }

No latitude/longitude.

9 Answers:

Answer 0 (score: 16):

The problem is that since iOS 4, UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage]; strips out the geolocation. To solve this, you have to use the original photo path to access the full image metadata, with something like this:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSURL *referenceURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:referenceURL resultBlock:^(ALAsset *asset) {
        ALAssetRepresentation *rep = [asset defaultRepresentation];
        NSDictionary *metadata = rep.metadata;
        NSLog(@"%@", metadata);

        CGImageRef iref = [rep fullScreenImage];

        if (iref) {
            self.imageView.image = [UIImage imageWithCGImage:iref];
        }
    } failureBlock:^(NSError *error) {
        // error handling
    }];
}

The output should look something like this:

{
    ColorModel = RGB;
    DPIHeight = 72;
    DPIWidth = 72;
    Depth = 8;
    Orientation = 6;
    PixelHeight = 1936;
    PixelWidth = 2592;
    "{Exif}" =     {
        ApertureValue = "2.970854";
        BrightnessValue = "1.115874";
        ColorSpace = 1;
        ComponentsConfiguration =         (
            0,
            0,
            0,
            1
        );
        DateTimeDigitized = "2012:07:14 21:55:05";
        DateTimeOriginal = "2012:07:14 21:55:05";
        ExifVersion =         (
            2,
            2,
            1
        );
        ExposureMode = 0;
        ExposureProgram = 2;
        ExposureTime = "0.06666667";
        FNumber = "2.8";
        Flash = 24;
        FlashPixVersion =         (
            1,
            0
        );
        FocalLength = "3.85";
        ISOSpeedRatings =         (
            200
        );
        MeteringMode = 5;
        PixelXDimension = 2592;
        PixelYDimension = 1936;
        SceneCaptureType = 0;
        SensingMethod = 2;
        Sharpness = 2;
        ShutterSpeedValue = "3.9112";
        SubjectArea =         (
            1295,
            967,
            699,
            696
        );
        WhiteBalance = 0;
    };
    "{GPS}" =     {
        Altitude = "1167.528";
        AltitudeRef = 0;
        ImgDirection = "278.8303";
        ImgDirectionRef = T;
        Latitude = "15.8235";
        LatitudeRef = S;
        Longitude = "47.99416666666666";
        LongitudeRef = W;
        TimeStamp = "00:55:04.59";
    };
    "{TIFF}" =     {
        DateTime = "2012:07:14 21:55:05";
        Make = Apple;
        Model = "iPhone 4";
        Orientation = 6;
        ResolutionUnit = 2;
        Software = "5.1.1";
        XResolution = 72;
        YResolution = 72;
        "_YCbCrPositioning" = 1;
    };
}

Answer 1 (score: 12):

We've done a lot of work with the camera and UIImagePickerController, and at least as of iOS 5.1.1 it does not return location data in the metadata for photos or videos shot with UIImagePickerController.

It doesn't matter whether Location Services is enabled for the Camera app or not; that controls the Camera app's use of Location Services, not the camera function inside UIImagePickerController.

Your app will need to use the CLLocation class to get the location and then add it to the image or video that comes back from the camera. Whether your app can get the location depends on whether the user authorizes Location Services access for your app. Note that the user can disable Location Services for your app (or for the device entirely) at any time through the Location Services settings.
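
A minimal sketch of that approach, assuming the view controller already owns a running CLLocationManager in a hypothetical locationManager property; the property name, the choice to re-save through ALAssetsLibrary, and the specific GPS keys are illustrative, not taken from this answer:

    #import <AssetsLibrary/AssetsLibrary.h>
    #import <CoreLocation/CoreLocation.h>
    #import <ImageIO/ImageIO.h>

    - (void)imagePickerController:(UIImagePickerController *)picker
     didFinishPickingMediaWithInfo:(NSDictionary *)info {
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        CLLocation *location = self.locationManager.location; // most recent fix, may be nil

        // Build a {GPS} dictionary from the CLLocation.
        CLLocationDegrees lat = location.coordinate.latitude;
        CLLocationDegrees lon = location.coordinate.longitude;
        NSDictionary *gps = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithDouble:(lat >= 0 ? lat : -lat)], (NSString *)kCGImagePropertyGPSLatitude,
            (lat >= 0 ? @"N" : @"S"), (NSString *)kCGImagePropertyGPSLatitudeRef,
            [NSNumber numberWithDouble:(lon >= 0 ? lon : -lon)], (NSString *)kCGImagePropertyGPSLongitude,
            (lon >= 0 ? @"E" : @"W"), (NSString *)kCGImagePropertyGPSLongitudeRef,
            nil];

        // Merge it into the metadata the picker hands back, then save the image with it.
        NSMutableDictionary *metadata = [[info objectForKey:UIImagePickerControllerMediaMetadata] mutableCopy];
        if (metadata == nil) {
            metadata = [NSMutableDictionary dictionary];
        }
        [metadata setObject:gps forKey:(NSString *)kCGImagePropertyGPSDictionary];

        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeImageToSavedPhotosAlbum:image.CGImage
                                     metadata:metadata
                              completionBlock:^(NSURL *assetURL, NSError *error) {
                                  NSLog(@"Saved with GPS: %@ (error: %@)", assetURL, error);
                              }];
    }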

Answer 2 (score: 4):

You're not using the image data from the camera in the code you posted; you've generated a JPEG representation of it, which effectively discards all the metadata. Use image.CGImage, as Caleb suggested.

Also:


"This is also for a commercial product, so I can't use http://code.google.com/p/iphone-exif/"

The author states quite clearly that a commercial license can be obtained.

Answer 3 (score: 4):

One possibility is to leave CoreLocation running while the camera is visible. Record each CLLocation into an array, along with the time of the sample. When the photo comes back, find its time, then match it against the closest CLLocation in the array.

Sounds kludgy, but it will work.
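
A rough sketch of that bookkeeping, assuming the view controller is the CLLocationManager's delegate and keeps a hypothetical samples mutable array (iOS 5-era delegate callback shown):

    #import <CoreLocation/CoreLocation.h>

    // Record every fix (a CLLocation carries its own timestamp) while the camera is up.
    - (void)locationManager:(CLLocationManager *)manager
        didUpdateToLocation:(CLLocation *)newLocation
               fromLocation:(CLLocation *)oldLocation {
        [self.samples addObject:newLocation];
    }

    // When the photo comes back, pick the sample closest in time to the capture date.
    - (CLLocation *)closestSampleToDate:(NSDate *)photoDate {
        CLLocation *best = nil;
        NSTimeInterval bestDelta = 0;
        for (CLLocation *sample in self.samples) {
            NSTimeInterval delta = ABS([sample.timestamp timeIntervalSinceDate:photoDate]);
            if (best == nil || delta < bestDelta) {
                bestDelta = delta;
                best = sample;
            }
        }
        return best;
    }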

Answer 4 (score: 3):

Can't say I've needed to do this in my own work, but it seems clear from the docs that if you're using UIImagePickerController, you can get the image the user just took in the -imagePicker:didFinishPickingMediaWithInfo: delegate method. Use the key UIImagePickerControllerOriginalImage to get the image.

Once you have the image, you should be able to access its properties, including EXIF data, as described in QA1654 Accessing image properties with ImageIO. To create the CGImageSource, I'd look at CGImageSourceCreateWithData() and use the data you get from UIImage's CGImage method. Once you have the image source, you can access the various attributes via CGImageSourceCopyProperties().
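
For reference, a minimal sketch of the ImageIO flow Caleb describes, wrapped in a small helper (the helper name is mine); whether the {GPS} dictionary comes back populated depends on the bytes you feed it, as the question's console output shows:

    #import <Foundation/Foundation.h>
    #import <ImageIO/ImageIO.h>

    // Returns the {GPS} properties from raw image bytes, or nil if none are present.
    static NSDictionary *GPSDictionaryForImageData(NSData *imageData) {
        CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
        if (!source) {
            return nil;
        }
        NSDictionary *properties =
            (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
        CFRelease(source);
        return [properties objectForKey:(NSString *)kCGImagePropertyGPSDictionary];
    }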

Answer 5 (score: 1):

As Chris Markle points out, Apple does strip the GPS data out of the EXIF. But you can open the raw data of the image and parse it yourself, or use a third-party library to do that, for example.

Here is some sample code:

- (void)imagePickerController:(UIImagePickerController *)picker
 didFinishPickingMediaWithInfo:(NSDictionary *)info {

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:[info objectForKey:UIImagePickerControllerReferenceURL]
             resultBlock:^(ALAsset *asset) {

                 ALAssetRepresentation *image_representation = [asset defaultRepresentation];
                 NSUInteger size = (NSUInteger)image_representation.size;
                 // create a buffer to hold the raw image data
                 uint8_t *buffer = (uint8_t *)malloc(size);
                 NSUInteger length = [image_representation getBytes:buffer fromOffset:0 length:size error:nil];

                 if (length != 0) {

                     // buffer -> NSData object; the buffer is freed when the NSData is deallocated
                     NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:size freeWhenDone:YES];

                     // EXFJpeg and EXFMetaData come from the third-party iphone-exif library
                     EXFJpeg *jpegScanner = [[EXFJpeg alloc] init];
                     [jpegScanner scanImageData:adata];
                     EXFMetaData *exifData = jpegScanner.exifMetaData;

                     id latitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLatitude]];
                     id longitudeValue = [exifData tagValue:[NSNumber numberWithInt:EXIF_GPSLongitude]];
                     id datetime = [exifData tagValue:[NSNumber numberWithInt:EXIF_DateTime]];
                     id model = [exifData tagValue:[NSNumber numberWithInt:EXIF_Model]]; // camera model (unused here)

                     self.locationLabel.text = [NSString stringWithFormat:@"Local: %@ - %@", latitudeValue, longitudeValue];
                     self.dateLabel.text = [NSString stringWithFormat:@"Data: %@", datetime];

                 }
                 else {
                     NSLog(@"image_representation buffer length == 0");
                 }
             }
            failureBlock:^(NSError *error) {
                NSLog(@"couldn't get asset: %@", error);
            }
     ];
}

Answer 6 (score: 0):

In your UIImagePickerController delegate, do the following:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
  NSDictionary *metadata = [info valueForKey:UIImagePickerControllerMediaMetadata];

  // metadata now contains all the image metadata.  Extract GPS data from here.
}
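
A small follow-up sketch for the extraction step, assuming the {GPS} sub-dictionary is actually present in that metadata dictionary (as the higher-voted answers note, it often is not for camera captures):

    #import <Foundation/Foundation.h>
    #import <ImageIO/ImageIO.h>

    // Pulls latitude/longitude out of a picker metadata dictionary, if they exist.
    static void LogGPSFromPickerMetadata(NSDictionary *metadata) {
        NSDictionary *gps = [metadata objectForKey:(NSString *)kCGImagePropertyGPSDictionary]; // the "{GPS}" key
        NSNumber *latitude = [gps objectForKey:(NSString *)kCGImagePropertyGPSLatitude];
        NSNumber *longitude = [gps objectForKey:(NSString *)kCGImagePropertyGPSLongitude];
        NSLog(@"lat %@ / lon %@", latitude, longitude);
    }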

Answer 7 (score: 0):

This was tested on iOS 8 and works for video, so it should work similarly for photos with a few tweaks.

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {

    NSURL *videoUrl = (NSURL *)[info objectForKey:UIImagePickerControllerMediaURL];
    NSString *moviePath = [videoUrl path];

    if ( UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath) ) {

        ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];

        [assetLibrary assetForURL:[info objectForKey:UIImagePickerControllerReferenceURL] resultBlock:^(ALAsset *asset) {

            CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
            NSLog(@"Location Meta: %@", location);

        } failureBlock:^(NSError *error) {
            NSLog(@"Video Date Error: %@", error);
        }];

    }

}

Answer 8 (score: 0):

A Swift answer:

import AssetsLibrary
import CoreLocation


// MARK: - UIImagePickerControllerDelegate
extension ViewController: UIImagePickerControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
        defer {
            dismiss(animated: true, completion: nil)
        }
        guard picker.sourceType == .photoLibrary else {
            return
        }
        guard let url = info[UIImagePickerControllerReferenceURL] as? URL else {
            return
        }

        let library = ALAssetsLibrary()
        library.asset(for: url, resultBlock: { (asset) in
            guard let coordinate = asset?.value(forProperty: ALAssetPropertyLocation) as? CLLocation else {
                return
            }
            print("\(coordinate)")

            // Getting human-readable address.
            let geocoder = CLGeocoder()
            geocoder.reverseGeocodeLocation(coordinate, completionHandler: { (placemarks, error) in
                guard let placemark = placemarks?.first else {
                    return
                }
                print("\(placemark.addressDictionary)")
            })
        }, failureBlock: { (error: Error?) in
            print("Unable to read metadata: \(error)")
        })
    }
}