I have code that grabs video frames from two different videos, merges them into a single frame, and then builds a movie from the merged frames. The problem is that the app crashes with a memory warning. Here is my code:
NSString *filePath = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"/Documents/movie.mp4"]];
NSURL *outputURL = [NSURL fileURLWithPath:filePath];
player = [[MPMoviePlayerController alloc] initWithContentURL:outputURL];

float frame = 0.00;
int count = 10;

NSFileManager *fileManager = [NSFileManager defaultManager];
NSString *docPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
docPath = [docPath stringByAppendingPathComponent:@"OutPut"];
BOOL success = [fileManager fileExistsAtPath:docPath];
if (success) {
    [fileManager removeItemAtPath:docPath error:nil];
}
[fileManager createDirectoryAtPath:docPath withIntermediateDirectories:YES attributes:nil error:nil];

for (frame = (frameStartTime); frame < (frameStartTime + 5); frame += 0.033) {
    UIImage *singleFrameImage = [player thumbnailImageAtTime:frame timeOption:MPMovieTimeOptionExact];
    [player pause];

    NSString *imageName = [NSString stringWithFormat:@"export2%d.png", count];
    NSString *file = [[NSBundle mainBundle] pathForResource:imageName ofType:nil];
    UIImage *overlayImage = [UIImage imageWithData:[NSData dataWithContentsOfFile:file]];
    count = count + 1;

    NSString *imagePath = [NSString stringWithFormat:@"%@/%@", docPath, imageName];
    if (overlayImage) {
        UIImage *outImage = [self mergeImage:singleFrameImage withImage:overlayImage];
        NSData *imgData = [[NSData alloc] initWithData:UIImagePNGRepresentation(outImage)];
        [fileManager createFileAtPath:imagePath contents:imgData attributes:nil];
        [imgData release];
    }
    else {
        NSData *imgData = UIImagePNGRepresentation(singleFrameImage);
        [fileManager createFileAtPath:imagePath contents:imgData attributes:nil];
    }

    [outputFramesArray addObject:imagePath];
}
[player release];

if ([fileManager fileExistsAtPath:filePath]) {
    [fileManager removeItemAtPath:filePath error:nil];
}

NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"/Documents/movie1.mp4"]];
NSLog(@"filePath %@", path);
if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
    [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
}

[self writeImageAsMovie:outputFramesArray toPath:path size:CGSizeMake(480, 320) duration:10];
NSLog(@"hello Your layering is completed");

[outputFramesArray removeAllObjects];
[outputFramesArray release];
Instruments shows the largest allocations (about 90%) on this line:
NSData *imgData = [[NSData alloc] initWithData:UIImagePNGRepresentation(outImage)];
In my entire app, the allocation level climbs to 130 MB only at this point in the code.
Can anyone suggest a solution?
Answer 0 (score: 1)
This is most likely because you keep allocating memory inside your loop, and the autoreleased objects that pile up on the current pool grow too large before control returns to the main run loop.
Read this blog post for details.
Do this:
...
for (frame = (frameStartTime); frame < (frameStartTime+5); frame+=0.033)
{
    @autoreleasepool {
        // Do whatever you do in your for loop...
    }
}
...
This creates a local pool that is drained at the end of every iteration, so autoreleased objects don't accumulate across the whole loop.
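As a rough sketch, applied to your own loop it could look like the code below (assuming the same player, docPath, count, frameStartTime, and outputFramesArray from your code). It also drops the extra [[NSData alloc] initWithData:...] copy, since UIImagePNGRepresentation already returns an NSData, and writes the PNG straight to disk:

for (frame = (frameStartTime); frame < (frameStartTime + 5); frame += 0.033) {
    @autoreleasepool {
        // Everything autoreleased in here (the thumbnail, the merged image,
        // the PNG data) is released when the pool drains at the end of the iteration.
        UIImage *singleFrameImage = [player thumbnailImageAtTime:frame timeOption:MPMovieTimeOptionExact];
        [player pause];

        NSString *imageName = [NSString stringWithFormat:@"export2%d.png", count];
        NSString *file = [[NSBundle mainBundle] pathForResource:imageName ofType:nil];
        UIImage *overlayImage = [UIImage imageWithData:[NSData dataWithContentsOfFile:file]];
        count = count + 1;

        NSString *imagePath = [docPath stringByAppendingPathComponent:imageName];
        UIImage *outImage = overlayImage ? [self mergeImage:singleFrameImage withImage:overlayImage]
                                         : singleFrameImage;

        // UIImagePNGRepresentation returns an autoreleased NSData, so there is
        // no need to wrap it in another alloc/initWithData:.
        NSData *imgData = UIImagePNGRepresentation(outImage);
        [imgData writeToFile:imagePath atomically:YES];

        [outputFramesArray addObject:imagePath];
    }
}

Because each frame's PNG data is written to disk inside the pool, only one frame's worth of image data is alive at a time instead of the whole batch.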