I am currently using AVAssetImageGenerator to extract every frame from a video, but sometimes it returns the same image twice in a row (even though the two results do not have the same "frame time"). Interestingly, it always happens (in my test video) every 5 frames.
Here and here are two of the images (open each in a new tab, then switch between tabs to see the difference).
Here is my code:
// setting up the generator & composition
self.generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;
self.composition = [AVVideoComposition videoCompositionWithPropertiesOfAsset:asset];

NSTimeInterval duration = CMTimeGetSeconds(asset.duration);
NSTimeInterval frameDuration = CMTimeGetSeconds(composition.frameDuration);
CGFloat totalFrames = round(duration / frameDuration);

NSMutableArray *times = [NSMutableArray array];
for (int i = 0; i < totalFrames; i++) {
    NSValue *time = [NSValue valueWithCMTime:CMTimeMakeWithSeconds(i * frameDuration, composition.frameDuration.timescale)];
    [times addObject:time];
}

AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    // If actualTime is not equal to requestedTime the image is ignored
    if (CMTimeCompare(actualTime, requestedTime) == 0) {
        if (result == AVAssetImageGeneratorSucceeded) {
            NSLog(@"%.02f %.02f", CMTimeGetSeconds(requestedTime), CMTimeGetSeconds(actualTime));
            // Each log shows a different actualTime.
            // frame extraction happens here...
        }
    }
};

generator.requestedTimeToleranceBefore = kCMTimeZero;
generator.requestedTimeToleranceAfter = kCMTimeZero;
[generator generateCGImagesAsynchronouslyForTimes:times completionHandler:handler];
Any idea where this might come from?
Answer 0 (score: 14)
See the following properties of AVAssetImageGenerator. You should set kCMTimeZero for both properties to get the exact frames.
/* The actual time of the generated images will be within the range [requestedTime-toleranceBefore, requestedTime+toleranceAfter] and may differ from the requested time for efficiency.
Pass kCMTimeZero for both toleranceBefore and toleranceAfter to request frame-accurate image generation; this may incur additional decoding delay.
Default is kCMTimePositiveInfinity. */
@property (nonatomic) CMTime requestedTimeToleranceBefore NS_AVAILABLE(10_7, 5_0);
@property (nonatomic) CMTime requestedTimeToleranceAfter NS_AVAILABLE(10_7, 5_0);
Before I set kCMTimeZero for both properties, I was getting the same image for different requested times, exactly as you describe. Just try the following code:
self.imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];
self.imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
self.imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
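For reference, a minimal sketch (the movieURL variable, the 1.0 s request time, and the 600 timescale are placeholders, not part of the original answer) combining the zero tolerances with a synchronous, frame-accurate grab:

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil]; // movieURL: a placeholder file URL
AVAssetImageGenerator *gen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
gen.appliesPreferredTrackTransform = YES;
gen.requestedTimeToleranceBefore = kCMTimeZero; // never substitute an earlier frame
gen.requestedTimeToleranceAfter = kCMTimeZero;  // never substitute a later frame

CMTime actualTime;
NSError *error = nil;
CGImageRef image = [gen copyCGImageAtTime:CMTimeMakeWithSeconds(1.0, 600) actualTime:&actualTime error:&error];
if (image != NULL) {
    // With both tolerances at kCMTimeZero, actualTime matches the requested time exactly.
    CGImageRelease(image);
}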
Answer 1 (score: 2)
I use a slightly different way of computing the CMTime requests, and it seems to work. Here is the code (assuming iOS):
- (void)extractImagesFromMovie {
    // set up the asset
    NSString *path = [[NSBundle mainBundle] pathForResource:@"myMovie" ofType:@"MOV"];
    NSURL *movURL = [NSURL fileURLWithPath:path];
    NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES,
                               AVURLAssetReferenceRestrictionsKey : @0 };
    AVURLAsset *movie = [[AVURLAsset alloc] initWithURL:movURL options:options];

    // set up the generator
    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:movie];
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.requestedTimeToleranceAfter = kCMTimeZero;

    // look for the video track
    AVAssetTrack *videoTrack = nil;
    BOOL foundTrack = NO;
    for (AVAssetTrack *track in movie.tracks) {
        if ([track.mediaType isEqualToString:AVMediaTypeVideo]) {
            if (foundTrack) { NSLog(@"Error - - - more than one video track"); return; }
            videoTrack = track;
            foundTrack = YES;
        }
    }
    if (!foundTrack) { NSLog(@"Error - - no video track at all"); return; }

    // compute the number of frames in the movie
    int frameRate = videoTrack.nominalFrameRate;
    float value = movie.duration.value;
    float timeScale = movie.duration.timescale;
    float totalSeconds = value / timeScale;
    int totalFrames = totalSeconds * frameRate;
    NSLog(@"total frames %d", totalFrames);
    int timeValuePerFrame = movie.duration.timescale / frameRate;
    NSMutableArray *allFrames = [NSMutableArray array];

    // get each frame
    for (int k = 0; k < totalFrames; k++) {
        CMTime frameTime;
        frameTime.value = timeValuePerFrame * k;
        frameTime.timescale = movie.duration.timescale;
        frameTime.flags = movie.duration.flags;
        frameTime.epoch = movie.duration.epoch;
        CMTime gotTime;
        CGImageRef myRef = [generator copyCGImageAtTime:frameTime actualTime:&gotTime error:nil];
        if (myRef != NULL) {
            [allFrames addObject:[UIImage imageWithCGImage:myRef]];
            CGImageRelease(myRef); // the copy... method transfers ownership of the CGImage
        }
        if (gotTime.value != frameTime.value) NSLog(@"requested %lld got %lld for k %d", frameTime.value, gotTime.value, k);
    }
    NSLog(@"got %lu images in the array", (unsigned long)[allFrames count]);
    // do something with the images here...
}
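One caveat worth noting: copyCGImageAtTime:actualTime:error: is synchronous and blocks the calling thread, so for long movies it is worth running this loop off the main thread. A minimal GCD sketch (the completion step is just a placeholder):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    [self extractImagesFromMovie]; // runs the synchronous extraction in the background
    dispatch_async(dispatch_get_main_queue(), ^{
        // hand the extracted frames to the UI here
    });
});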
Answer 2 (score: 2)
I ran into the same issue, but much more noticeably: the duplicates happened whenever the interval between two frames was below 1.0 second, and I realized it depends on the timescale I was using to generate the CMTime values.
I changed
CMTime requestTime = CMTimeMakeWithSeconds(imageTime, 1);
to
CMTime requestTime = CMTimeMakeWithSeconds(imageTime, playerItem.asset.duration.timescale);
...and boom, no more duplicates :)
So maybe you can try increasing the timescale in your code, say by doubling it:
NSValue * time = [NSValue valueWithCMTime:CMTimeMakeWithSeconds(i*frameDuration, composition.frameDuration.timescale*2)]; // *2 at the end
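To see why the timescale matters: CMTimeMakeWithSeconds rounds the seconds value to whole ticks of the given timescale, so at a timescale of 1 every sub-second request collapses onto the same whole-second CMTime. A quick illustration (the 0.4 s value and 600 timescale are arbitrary):

CMTime coarse = CMTimeMakeWithSeconds(0.4, 1);   // rounds to 0 ticks at 1 tick per second
CMTime fine = CMTimeMakeWithSeconds(0.4, 600);   // 240 ticks at 600 ticks per second
NSLog(@"coarse: %lld/%d  fine: %lld/%d", coarse.value, coarse.timescale, fine.value, fine.timescale);
// At timescale 1, requests for 0.0 s, 0.2 s, and 0.4 s all become the same CMTime,
// which is exactly the duplicate-image symptom described above.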
Here is my code:
playerItem = [AVPlayerItem playerItemWithURL:item.movieUrl];
imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:playerItem.asset];
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;

float duration = item.duration;
float interval = item.interval;
NSLog(@"\nItem info:\n%f \n%f", duration, interval);

NSString *srcPath = nil;
NSString *zipPath = nil;
srcPath = [item.path stringByAppendingPathComponent:@"info.json"];
zipPath = [NSString stringWithFormat:@"/%@/info.json", galleryID];
[zip addFileToZip:srcPath newname:zipPath level:0];

NSTimeInterval frameNum = item.duration / item.interval;
for (int i = 0; i <= frameNum; i++)
{
    NSArray *cachePathArray = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES);
    NSString *cachePath = [cachePathArray lastObject];
    srcPath = [cachePath stringByAppendingPathComponent:@"export-tmp.jpg"];
    zipPath = [NSString stringWithFormat:@"/%@/%d.jpg", galleryID, i];

    float imageTime = (i * interval);
    NSError *error = nil;
    CMTime requestTime = CMTimeMakeWithSeconds(imageTime, playerItem.asset.duration.timescale);
    CMTime actualTime;
    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:requestTime actualTime:&actualTime error:&error];
    if (error == nil) {
        float req = ((float)requestTime.value / requestTime.timescale);
        float real = ((float)actualTime.value / actualTime.timescale);
        float diff = fabsf(req - real);
        NSLog(@"copyCGImageAtTime: %.2f, %.2f, %f", req, real, diff);
    }
    else {
        NSLog(@"copyCGImageAtTime: error: %@", error.localizedDescription);
    }

    // consider using CGImageDestination -> http://stackoverflow.com/questions/1320988/saving-cgimageref-to-a-png-file
    if (imageRef != NULL) {
        UIImage *img = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef); // CGImageRef won't be released by ARC
        [UIImageJPEGRepresentation(img, 1.0) writeToFile:srcPath atomically:YES]; // JPEG quality is in the range 0.0-1.0
    }

    if (srcPath != nil && zipPath != nil)
    {
        [zip addFileToZip:srcPath newname:zipPath level:0]; // 0 = no compression; everything is a jpg image
        unlink([srcPath UTF8String]);
    }
}