I am trying to generate a thumbnail from an mp4 file as part of an Xcode project. I have successfully used a ThumbGenerator class to generate thumbnails from mp4 files, but I cannot generate thumbnails for m3u8 files. So I am trying a slightly different approach: grabbing the pixel buffer output from AVPlayer and converting it to a UIImage. I created a GitHub repo with a sample project: https://github.com/dep2k/VideoThumbnail.git
In viewDidLoad I call a load method that creates the AVPlayer and loads the video:
- (void)load {
    NSURL *url = [[NSBundle bundleForClass:self.class] URLForResource:@"test" withExtension:@"mp4"];
    self.playerItem = [AVPlayerItem playerItemWithURL:url];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    layer.frame = self.view.frame;
    layer.backgroundColor = [UIColor redColor].CGColor;
    [self.view.layer addSublayer:layer];

    [self.player addObserver:self forKeyPath:@"currentItem.status" options:NSKeyValueObservingOptionNew context:kStatusDidChangeKVO];
    [self.player addObserver:self forKeyPath:@"currentItem.loadedTimeRanges" options:NSKeyValueObservingOptionNew context:kTimeRangesKVO];
}
In the player's KVO observer method, I have the code that triggers thumbnail generation:
- (void)observeValueForKeyPath:(NSString *)inKeyPath ofObject:(id)inObject change:(NSDictionary *)inChange context:(void *)inContext
{
    if (inContext == kStatusDidChangeKVO) {
        AVPlayerItemStatus status = self.player.currentItem.status;
        if (status == AVPlayerItemStatusReadyToPlay && self.player.status == AVPlayerStatusReadyToPlay) {
            if (!self.isPlaying) {
                self.isPlaying = true;
                self.player.rate = 1.0;
                self.player.muted = YES;
                // Let the video play for 4 seconds before grabbing a frame.
                dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 4 * NSEC_PER_SEC), dispatch_get_main_queue(), ^{
                    [self generateThumb];
                });
            }
        }
    } else if (inContext == kTimeRangesKVO) {
        NSArray *timeRanges = (NSArray *)[inChange objectForKey:NSKeyValueChangeNewKey];
        NSLog(@"Time Ranges: %@", timeRanges);
    }
}
Finally, to generate the thumbnail, I have the following code:
- (void)generateThumb {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void) {
        NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
        self.output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
        [self.player.currentItem addOutput:self.output];

        // CMTimeMakeWithSeconds(seconds, preferredTimescale): this is 90000 seconds at a 90 kHz timescale.
        CMTime vTime = CMTimeMakeWithSeconds(90000, 90000);
        BOOL foundFrame = [self.output hasNewPixelBufferForItemTime:vTime];
        if (foundFrame) {
            CIContext *temporaryContext = [CIContext contextWithOptions:nil];
            CVPixelBufferRef pixelBuffer = [self.output copyPixelBufferForItemTime:vTime itemTimeForDisplay:nil];
            CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
            CGImageRef videoImage = [temporaryContext createCGImage:ciImage
                                                           fromRect:CGRectMake(0, 0, 288, 192)];
            UIImage *image = [UIImage imageWithCGImage:videoImage];
            CGImageRelease(videoImage);
            CVPixelBufferRelease(pixelBuffer); // copyPixelBufferForItemTime: returns a +1 retained buffer
            NSLog(@"FrameFound");
        } else {
            NSLog(@"FrameNotFound");
        }
    });
}
The hasNewPixelBufferForItemTime: method always returns NO, even though the player is playing the video successfully.
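For comparison, my understanding of the pattern in Apple's AVPlayerItemVideoOutput documentation and sample code is to attach the output before (or as soon as) playback starts, so frames are actually routed to it, and then to poll with the item time corresponding to the current host time rather than a fixed CMTime. A sketch of that pattern, not yet verified against this project (it reuses the playerItem property from the load method above; the thumbnail variable name is mine):

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <QuartzCore/QuartzCore.h> // CACurrentMediaTime
#import <UIKit/UIKit.h>

// Attach the video output as soon as the player item exists, so frames
// are delivered to it from the start of playback onward.
NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
[self.playerItem addOutput:output];

// Later (e.g. in the dispatch_after block), ask for the frame at "now":
CMTime now = [output itemTimeForHostTime:CACurrentMediaTime()];
if ([output hasNewPixelBufferForItemTime:now]) {
    CVPixelBufferRef pixelBuffer = [output copyPixelBufferForItemTime:now itemTimeForDisplay:nil];
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // Use the buffer's real dimensions instead of a hard-coded 288x192.
    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(pixelBuffer),
                             CVPixelBufferGetHeight(pixelBuffer));
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
    UIImage *thumbnail = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CVPixelBufferRelease(pixelBuffer); // copy… follows the Create Rule (+1 reference)
}
```

As I read it, an output added only moments before hasNewPixelBufferForItemTime: is called has not yet had any frames delivered to it, which would explain the constant NO — but I have not confirmed this is the whole story, especially for the m3u8 case.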