Objective-C: Getting a PNG thumbnail from a movie using NSData

Posted: 2015-10-10 08:02:53

Tags: ios objective-c nsdata

I have the following code, which tries to get a screenshot of a video file from NSData. I can confirm that the NSData is valid and not nil, but both dataString and movieURL come back nil.

- (UIImage *)imageFromMovie:(NSData *)movieData {

    // set up the movie player
    NSString *dataString = [[NSString alloc] initWithData:movieData encoding:NSUTF8StringEncoding];
    NSURL *movieURL = [NSURL URLWithString:dataString];

    // get the thumbnail
    AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:movieURL options:nil];
    AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
    generate1.appliesPreferredTrackTransform = YES;
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 2);
    CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
    UIImage *one = [[UIImage alloc] initWithCGImage:oneRef];

    return(one);

}

Edit: here is where/how I get the NSData from the UIImagePicker

if ([mediaType isEqualToString:@"ALAssetTypeVideo"]) {

    ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
    [assetLibrary assetForURL:[[info objectAtIndex:x] valueForKey:UIImagePickerControllerReferenceURL] resultBlock:^(ALAsset *asset) {

        ALAssetRepresentation *rep = [asset defaultRepresentation];

        unsigned long DataSize = (unsigned long)[rep size];
        Byte *buffer = (Byte *)malloc(DataSize);
        NSUInteger buffered = [rep getBytes:buffer fromOffset:0.0 length:DataSize error:nil];

        // here's the NSData
        NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];

    } failureBlock:^(NSError *err) {
        NSLog(@"Error: %@", [err localizedDescription]);
    }];
}

3 Answers:

Answer 0 (score: 0)

You are probably running into an encoding problem.

The NSString instance method -initWithData:encoding: returns nil if the data does not represent valid data for the given encoding (https://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Classes/NSString_Class/#//apple_ref/occ/instm/NSString/initWithData:encoding:).

Try passing the correct encoding to -initWithData:encoding:.
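
To see why the decode fails here, a minimal illustration (the file path is hypothetical): raw video bytes are almost never valid UTF-8, so -initWithData:encoding: returns nil, while data that actually wraps an encoded URL string decodes fine.

// Binary movie bytes almost never form valid UTF-8, so decoding yields nil.
NSData *movieData = [NSData dataWithContentsOfFile:@"/tmp/example.mov"]; // hypothetical path
NSString *decoded = [[NSString alloc] initWithData:movieData encoding:NSUTF8StringEncoding];
NSLog(@"decoded = %@", decoded); // logs "(null)" for binary movie data

// Decoding succeeds only when the bytes really are encoded text, e.g. a URL string.
NSData *urlData = [@"file:///tmp/example.mov" dataUsingEncoding:NSUTF8StringEncoding];
NSString *urlString = [[NSString alloc] initWithData:urlData encoding:NSUTF8StringEncoding];
NSURL *movieURL = [NSURL URLWithString:urlString]; // non-nil, because the bytes were a UTF-8 string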

Answer 1 (score: 0)

You are trying to turn the movie data itself into an NSURL, which is why you get a nil URL.

In your implementation, you can get the thumbnail this way:

AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:[[info objectAtIndex:x] valueForKey:UIImagePickerControllerReferenceURL] options:nil];
AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 2);
CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = [[UIImage alloc] initWithCGImage:oneRef];
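
A note on memory management for this snippet: copyCGImageAtTime:actualTime:error: returns a Core Foundation image that the caller owns, so even under ARC it needs an explicit release once the UIImage exists. A minimal sketch of the tail end of the snippet with that cleanup and a nil check:

CGImageRef oneRef = [generate1 copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = nil;
if (oneRef != NULL) {
    one = [[UIImage alloc] initWithCGImage:oneRef];
    CGImageRelease(oneRef); // the UIImage keeps its own reference to the bitmap
} else {
    NSLog(@"Thumbnail generation failed: %@", err);
}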

Answer 2 (score: 0)

Download my sample project before reading the answer below:

https://drive.google.com/open?id=0B_exgT43OZJOWl9HMDJCR0cyTW8

I know it has been a long time since you posted this question; however, I found it, I can answer it, and I have good reason to believe that, unless you did what you are asking with the sample code provided on the Apple Developer Connection site, you still need an answer. I base that entirely on the fact that this is hard to figure out.

Nonetheless, I have a basic, working project that solves your problem; before looking at it, though, check out the video of it running on my iPhone 6s Plus:



(Demo video: https://www.youtube.com/embed/GiF-FFKvy5M)

As you can see, the poster frame of each asset in my iPhone's video collection is displayed in a UICollectionViewCell; in the UICollectionViewController (that is, the UICollectionView data source/delegate):

// Asynchronously resolves the PHAsset into an AVAsset and hands it to the cell,
// which then generates its own poster frame (see setFrameTime: below).
void (^renderThumbnail)(NSIndexPath *, CustomCell *) = ^(NSIndexPath *indexPath, CustomCell *cell) {
    [[PHImageManager defaultManager] requestAVAssetForVideo:AppDelegate.assetsFetchResults[indexPath.section] options:nil resultHandler:^(AVAsset * _Nullable asset, AVAudioMix * _Nullable audioMix, NSDictionary * _Nullable info) {
        cell.asset = [asset copy];
        cell.frameTime = [NSValue valueWithCMTime:kCMTimeZero];
    }];
};

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    PHAsset *phAsset = AppDelegate.assetsFetchResults[indexPath.section];
    CustomCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:indexPath];
    cell.representedAssetIdentifier = phAsset.localIdentifier;
    CGFloat hue = (CGFloat)indexPath.section / 5;
    cell.backgroundColor = [UIColor colorWithHue:hue saturation:1.0f brightness:0.5f alpha:1.0f];

    if ([cell.representedAssetIdentifier isEqualToString:phAsset.localIdentifier]) {
        NSPurgeableData *data = [self.thumbnailCache objectForKey:phAsset.localIdentifier];
        [data beginContentAccess];
        UIImage *image = [UIImage imageWithData:data];
        if (image != nil) {
            cell.contentView.layer.contents = (__bridge id)image.CGImage;
            NSLog(@"Cached image found");
        } else {
            renderThumbnail(indexPath, cell);
        }
        [data endContentAccess];
        [data discardContentIfPossible];
    }

    // Request an image for the asset from the PHCachingImageManager.
    /*[AppDelegate.imageManager requestImageForAsset:phAsset
                                         targetSize:cell.contentView.bounds.size
                                        contentMode:PHImageContentModeAspectFill
                                            options:nil
                                      resultHandler:^(UIImage *result, NSDictionary *info) {
        // Set the cell's thumbnail image if it's still showing the same asset.
        if ([cell.representedAssetIdentifier isEqualToString:phAsset.localIdentifier]) {
            cell.thumbnailImage = result;
        }
    }];*/

    return cell;
}

In the UICollectionViewCell subclass:

@implementation CustomCell

- (void)prepareForReuse {
    [super prepareForReuse];
    _asset = nil;
    _frameTime = nil;
    _thumbnailImage = nil;
    [self.contentView.layer setContents:nil];
    [[self contentView] setContentMode:UIViewContentModeScaleAspectFit];
    [[self contentView] setClipsToBounds:YES];
}

- (void)dealloc {

}

- (void)setAsset:(AVAsset *)asset {
    _asset = asset;
}

- (void)setFrameTime:(NSValue *)frameTime {
    _frameTime = frameTime;
    dispatch_queue_t concurrentQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    dispatch_async(concurrentQueue, ^{
        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:_asset];
        imageGenerator.appliesPreferredTrackTransform = YES;
        imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
        imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
        [imageGenerator generateCGImagesAsynchronouslyForTimes:@[frameTime] completionHandler:^(CMTime requestedTime, CGImageRef _Nullable image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
            dispatch_sync(dispatch_get_main_queue(), ^{
                self.thumbnailImage = [UIImage imageWithCGImage:image scale:25.0 orientation:UIImageOrientationUp];
            });
        }];
    });
}

- (void)setThumbnailImage:(UIImage *)thumbnailImage {
    _thumbnailImage = thumbnailImage;
    self.contentView.layer.contents = (__bridge id)_thumbnailImage.CGImage;
}

@end
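
The header for CustomCell isn't shown; judging from the setters and properties the implementation uses, its interface would look roughly like this (a sketch under that assumption, not the project's actual file):

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface CustomCell : UICollectionViewCell

// Identifies which PHAsset the cell is currently showing, so async results can be validated.
@property (nonatomic, copy) NSString *representedAssetIdentifier;
// Setting these triggers thumbnail generation in the implementation above.
@property (nonatomic, strong) AVAsset *asset;
@property (nonatomic, strong) NSValue *frameTime;
@property (nonatomic, strong) UIImage *thumbnailImage;

@end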

The NSCache is set up as follows:

self.thumbnailCache = [[NSCache alloc] init];
self.thumbnailCache.name = @"Thumbnail Cache";
self.thumbnailCache.delegate = self;
self.thumbnailCache.evictsObjectsWithDiscardedContent = true;
self.thumbnailCache.countLimit = AppDelegate.assetsFetchResults.count;
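
The cell code above reads NSPurgeableData out of thumbnailCache, but the write side isn't shown here. A minimal sketch of how a generated thumbnail could be stored back (the helper name, the PNG encoding, and the call site are assumptions):

// Hypothetical helper: store a freshly generated thumbnail so later
// -collectionView:cellForItemAtIndexPath: passes can reuse it instead of regenerating.
- (void)cacheThumbnail:(UIImage *)thumbnail forAssetIdentifier:(NSString *)localIdentifier {
    if (thumbnail == nil || localIdentifier == nil) { return; }
    NSData *png = UIImagePNGRepresentation(thumbnail);                   // matches the question's PNG goal
    NSPurgeableData *purgeable = [[NSPurgeableData alloc] initWithData:png]; // created with content access begun
    [self.thumbnailCache setObject:purgeable forKey:localIdentifier];
    [purgeable endContentAccess];                                        // let NSCache purge it under memory pressure
}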

The PHAssets are obtained this way:

- (PHFetchResult *)assetsFetchResults {
    __block PHFetchResult *i = self->_assetsFetchResults;
    if (!i) {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumVideos options:nil];
            self->_assetCollection = smartAlbums.firstObject;
            if (![self->_assetCollection isKindOfClass:[PHAssetCollection class]]) self->_assetCollection = nil;
            PHFetchOptions *allPhotosOptions = [[PHFetchOptions alloc] init];
            allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
            i = [PHAsset fetchAssetsInAssetCollection:self->_assetCollection options:allPhotosOptions];
            self->_assetsFetchResults = i;
        });
    }

    return i;
}
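
One practical prerequisite not shown in these snippets: the Photos fetch only returns assets after the user grants library access. A minimal sketch of the standard authorization check that would run before relying on assetsFetchResults (wiring it into this exact project is an assumption):

#import <Photos/Photos.h>

// Ask for (or confirm) photo-library access before touching assetsFetchResults.
[PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
    if (status == PHAuthorizationStatusAuthorized) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.collectionView reloadData]; // assumes this runs in the collection view controller
        });
    } else {
        NSLog(@"Photo library access was not granted (status = %ld)", (long)status);
    }
}];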