Get a thumbnail from a video on tap

Date: 2015-02-07 04:08:49

Tags: ios objective-c avfoundation uigesturerecognizer mpmovieplayercontroller

I need to play a video. While the video is playing, I want to be able to tap it and extract a thumbnail of the frame being shown.

I have been reading the docs/SO, and it seems I can capture an image via AVFoundation/AVPlayer or via MPMoviePlayerController. Both require a time/time range to produce an image. But how do I extract an image from a tap on the player?

I figure I can use a UIGestureRecognizer, but how do I create the relationship between the tap and the movie time? And should I use AVFoundation or MPMoviePlayerController?

Any hints are appreciated; I have limited experience with this.

1 Answer:

Answer 0 (score: 2)

I wrote some code to grab frames at specific times, shown below. Have a look; it may help you.

// Assumes ImagesMainArray is an NSMutableArray ivar.
- (void)getArrayOfFrameFromVideoURLs:(NSURL *)outputFileURL
{
    ImagesMainArray = [NSMutableArray array];

    AVURLAsset *asset = [[AVURLAsset alloc]
        initWithURL:outputFileURL
            options:@{AVURLAssetPreferPreciseDurationAndTimingKey : @YES}];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES; // if I omit this, the frames are rotated 90° (didn't try in landscape)
    AVVideoComposition *composition = [AVVideoComposition videoCompositionWithPropertiesOfAsset:asset];

    // Retrieving the video properties
    NSTimeInterval duration = CMTimeGetSeconds(asset.duration);
    CGFloat frameDuration = CMTimeGetSeconds(composition.frameDuration);
    CGSize renderSize = composition.renderSize;
    NSUInteger totalFrames = (NSUInteger)round(duration / frameDuration);

    // Log the count of frames per second of the video track
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] lastObject];
    float fps = videoTrack ? videoTrack.nominalFrameRate : 0.0f;
    NSLog(@"FramesPerSecond=%f", fps);

    // Selecting each frame we want to extract: all of them.
    NSMutableArray *times = [NSMutableArray arrayWithCapacity:totalFrames];
    for (NSUInteger i = 0; i < totalFrames; i++) {
        CMTime time = CMTimeMakeWithSeconds(i * frameDuration,
                                            composition.frameDuration.timescale);
        [times addObject:[NSValue valueWithCMTime:time]];
    }

    // The handler is invoked once per requested time, on a background queue.
    __block NSUInteger completed = 0;
    AVAssetImageGeneratorCompletionHandler handler =
        ^(CMTime requestedTime, CGImageRef im, CMTime actualTime,
          AVAssetImageGeneratorResult result, NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            [ImagesMainArray addObject:[UIImage imageWithCGImage:im]];
        } else {
            NSLog(@"Ouch: %@", error.description);
        }
        completed++;
        if (completed == totalFrames) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self showArrayOfImgsMain];
            });
        }
    };

    // Launching the process... zero tolerance asks for exact frame times.
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.requestedTimeToleranceAfter = kCMTimeZero;
    generator.maximumSize = renderSize;
    [generator generateCGImagesAsynchronouslyForTimes:times completionHandler:handler];
}
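To tie this back to the tap part of the question: the tap itself carries no time information; the relationship comes from asking the AVPlayer what time it is currently showing when the tap lands. A minimal sketch is below. Note that `self.player`, `self.playerView`, and `handleThumbnailTap:` are illustrative names assumed for this sketch, not part of the answer above.

// Assumes self.player (AVPlayer) and self.playerView (the view showing
// the player) exist; names are illustrative.
- (void)attachTapRecognizer
{
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleThumbnailTap:)];
    [self.playerView addGestureRecognizer:tap];
}

- (void)handleThumbnailTap:(UITapGestureRecognizer *)recognizer
{
    // The tap-to-time relationship: read the player's current playback time.
    CMTime tapTime = self.player.currentTime;

    AVAsset *asset = self.player.currentItem.asset;
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.requestedTimeToleranceAfter = kCMTimeZero;

    NSError *error = nil;
    CMTime actualTime;
    CGImageRef cgImage = [generator copyCGImageAtTime:tapTime
                                           actualTime:&actualTime
                                                error:&error];
    if (cgImage) {
        UIImage *thumbnail = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        // Use the thumbnail...
    } else {
        NSLog(@"Thumbnail failed: %@", error);
    }
}

This favors AVFoundation (AVPlayer) over MPMoviePlayerController, since AVAssetImageGenerator works directly with the same asset the player is using.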

You can read more here:

https://developer.apple.com/library/ios/samplecode/AVPlayerDemo/Introduction/Intro.htm

http://iosguy.com/tag/avplayer/

https://stackoverflow.com/a/16398642