I'm playing a video with AVPlayer. It works fine.
Now I want to get a UIImage from the video while it is playing (when I press a button).
My AVPlayer has a CALayer attached to it, which is used to display the video on a UIView.
My idea is to grab a UIImage from that CALayer while the video plays.
I tried this with the code from another question: UIImage from CALayer - iPhone SDK
But my UIImage is empty. The resolution is right, but the image is completely white!
It seems the video never writes its content into the CALayer.
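For reference, the linked question boils down to rendering the layer into a bitmap context, roughly like the sketch below (playerLayer is assumed to be the CALayer attached to the player; the exact code from that question is not reproduced here). With an AVPlayerLayer this produces the blank image described above, because the video frames are composited outside Core Animation's normal render tree:

    // Sketch of the renderInContext: technique referenced above.
    // With an AVPlayerLayer this yields a blank image, since the video
    // content is not drawn through Core Animation's render tree.
    UIGraphicsBeginImageContextWithOptions(playerLayer.bounds.size, NO, 0.0);
    [playerLayer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();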
Answer 0 (Score: 8)
I couldn't get Meet's solution to work for me, but it got me thinking in the right direction.
Below is the code I ended up using in my project. The method screenshotFromPlayer:maximumSize: accepts the AVPlayer instance to take the screenshot from, and a CGSize that will be the maximum size of the returned image.
- (UIImage *)screenshotFromPlayer:(AVPlayer *)player maximumSize:(CGSize)maxSize {
    CMTime actualTime;
    NSError *error;
    AVAssetImageGenerator *generator =
        [[AVAssetImageGenerator alloc] initWithAsset:player.currentItem.asset];

    // Setting a maximum size is not necessary for this code to
    // successfully get a screenshot, but it was useful for my project.
    generator.maximumSize = maxSize;

    CGImageRef cgIm = [generator copyCGImageAtTime:player.currentTime
                                        actualTime:&actualTime
                                             error:&error];
    // Check the result before using the image; on failure cgIm may be NULL.
    if (cgIm == NULL || error != nil) {
        NSLog(@"Error making screenshot: %@", [error localizedDescription]);
        NSLog(@"Actual screenshot time: %f Requested screenshot time: %f",
              CMTimeGetSeconds(actualTime),
              CMTimeGetSeconds(player.currentTime));
        return nil;
    }

    UIImage *image = [UIImage imageWithCGImage:cgIm];
    CFRelease(cgIm);
    return image;
}
Note also that the method generateCGImagesAsynchronouslyForTimes:completionHandler: can be used instead of copyCGImageAtTime:actualTime:error: (on the AVAssetImageGenerator instance) to perform the image generation asynchronously.
This code sample generates the screenshot at the AVPlayer's currentTime, but any time could be used.
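For illustration, here is a minimal sketch of that asynchronous variant, assuming player is the same AVPlayer instance as above; the completion handler is called on a background queue, so any UIKit work must be dispatched back to the main queue:

    // Sketch of the asynchronous variant mentioned above (assumes `player`).
    AVAssetImageGenerator *generator =
        [[AVAssetImageGenerator alloc] initWithAsset:player.currentItem.asset];
    NSValue *timeValue = [NSValue valueWithCMTime:player.currentTime];
    [generator generateCGImagesAsynchronouslyForTimes:@[timeValue]
                                    completionHandler:^(CMTime requestedTime,
                                                        CGImageRef image,
                                                        CMTime actualTime,
                                                        AVAssetImageGeneratorResult result,
                                                        NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            // UIImage retains the CGImage, so it stays valid after the handler returns.
            UIImage *screenshot = [UIImage imageWithCGImage:image];
            dispatch_async(dispatch_get_main_queue(), ^{
                NSLog(@"Got screenshot of size %@", NSStringFromCGSize(screenshot.size));
            });
        }
    }];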
Answer 1 (Score: 8)
Code to get an image from an AVPlayer:
- (UIImage *)currentItemScreenShot
{
    AVPlayer *abovePlayer = [abovePlayerView player];
    CMTime time = [[abovePlayer currentItem] currentTime];
    AVAsset *asset = [[[abovePlayerView player] currentItem] asset];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];

    // First try to grab the exact frame at the current time (zero tolerance),
    // if the tolerance setters are available on this OS version.
    if ([imageGenerator respondsToSelector:@selector(setRequestedTimeToleranceBefore:)] &&
        [imageGenerator respondsToSelector:@selector(setRequestedTimeToleranceAfter:)]) {
        [imageGenerator setRequestedTimeToleranceBefore:kCMTimeZero];
        [imageGenerator setRequestedTimeToleranceAfter:kCMTimeZero];
    }

    CGImageRef imgRef = [imageGenerator copyCGImageAtTime:time
                                               actualTime:NULL
                                                    error:NULL];
    if (imgRef == nil) {
        // Exact request failed; fall back to the default (infinite) tolerance.
        if ([imageGenerator respondsToSelector:@selector(setRequestedTimeToleranceBefore:)] &&
            [imageGenerator respondsToSelector:@selector(setRequestedTimeToleranceAfter:)]) {
            [imageGenerator setRequestedTimeToleranceBefore:kCMTimePositiveInfinity];
            [imageGenerator setRequestedTimeToleranceAfter:kCMTimePositiveInfinity];
        }
        imgRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    }

    UIImage *image = [[UIImage alloc] initWithCGImage:imgRef];
    CGImageRelease(imgRef);
    [imageGenerator release];

    return [image autorelease];
}
If you need the time to be precise, set [imageGenerator setRequestedTimeToleranceBefore:kCMTimeZero] and [imageGenerator setRequestedTimeToleranceAfter:kCMTimeZero].
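As a usage sketch (not part of the original answer), the method above could be wired to a button, matching the question's "when I press a button" scenario; snapshotButtonTapped: and snapshotImageView are hypothetical names:

    // Hypothetical button action; grabs a still from the currently playing item.
    - (IBAction)snapshotButtonTapped:(id)sender
    {
        UIImage *screenshot = [self currentItemScreenShot];
        if (screenshot != nil) {
            self.snapshotImageView.image = screenshot; // hypothetical UIImageView outlet
        }
    }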
Answer 2 (Score: 1)
Try AVAssetImageGenerator to get an image from the video file of the specified asset:
// UIImagePickerControllerReferenceURL already holds an NSURL, so pass it directly.
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[info objectForKey:@"UIImagePickerControllerReferenceURL"] options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = TRUE;
[asset release];
CMTime thumbTime = CMTimeMakeWithSeconds(0, 30);
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    if (result != AVAssetImageGeneratorSucceeded) {
        NSLog(@"Image generation failed: %@", error);
    }
    UIImage *thumbImg = [[UIImage imageWithCGImage:im] retain];
    [generator release];
    // The original snippet ends here; use thumbImg as needed.
};
// Kick off asynchronous generation for the requested time with the handler above.
[generator generateCGImagesAsynchronouslyForTimes:@[[NSValue valueWithCMTime:thumbTime]]
                                 completionHandler:handler];