I want to add a text/string to a video played with MPMoviePlayerViewController so that it becomes part of the video itself. That way, when I post the video to Facebook or Twitter, the text shows up on top of the video.
To do this, I tried extracting every frame of the video, drawing the text onto each frame, and then assembling all of those frames back into a video. With this approach, however, I run into memory problems and the app crashes on the device.
- (NSArray *)getVideoFramesFromMovieController:(MPMoviePlayerViewController *)mpMoviePlayerVC
{
    NSLog(@"Getting frames from a video asset.");
    NSMutableArray *videoFrames = [NSMutableArray array];

    // Step through the movie in 1/frameRate increments and grab a thumbnail for each step.
    for (float i = 0; i <= mpMoviePlayerVC.moviePlayer.duration; )
    {
        UIImage *singleFrameImage = [mpMoviePlayerVC.moviePlayer thumbnailImageAtTime:i
                                                                           timeOption:MPMovieTimeOptionExact];
        [videoFrames addObject:singleFrameImage];
        NSLog(@"Got frame number : %lu", (unsigned long)[videoFrames count]);
        i = i + (1 / self.frameRate); // frame capturing interval, e.g. 15 fps
    }
    NSLog(@"Total frames: %lu", (unsigned long)[videoFrames count]);
    return [NSArray arrayWithArray:videoFrames];
}
The method above gives me all the frames. I then draw a text such as "Hello" onto every frame, and finally build a video out of all those frames.
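The question doesn't show the drawing step itself, so here is a minimal sketch of how one frame could be composited with a string using UIKit/Core Graphics. The method name, font, and text position are purely illustrative and not part of the original code:
// Hypothetical helper: draws `text` onto one frame image and returns the composited image.
- (UIImage *)imageByDrawingText:(NSString *)text onFrame:(UIImage *)frame
{
    UIGraphicsBeginImageContextWithOptions(frame.size, NO, frame.scale);
    [frame drawInRect:CGRectMake(0, 0, frame.size.width, frame.size.height)];

    NSDictionary *attributes = @{ NSFontAttributeName            : [UIFont boldSystemFontOfSize:24],
                                  NSForegroundColorAttributeName : [UIColor whiteColor] };
    [text drawAtPoint:CGPointMake(20, 20) withAttributes:attributes];

    UIImage *composited = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return composited;
}
Doing this for every frame of a clip is exactly what exhausts memory on the device, which is why the accepted answer below composites the text as a Core Animation layer during export instead.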
- (void)writeImageAsMovie:(NSArray *)array toPath:(NSString *)path size:(CGSize)size
{
    NSLog(@"Inside writeImageAsMovie method.");
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                           fileType:AVFileTypeMPEG4
                                                              error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];
    AVAssetWriterInput *writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                          outputSettings:videoSettings] retain];
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];
    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);
    [videoWriter addInput:writerInput];

    // Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
    int __block frame = 0;

    [writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
        while ([writerInput isReadyForMoreMediaData])
        {
            NSLog(@"Total frames to be written: %lu", (unsigned long)[array count]);
            if (++frame >= [array count]) // all frames have been written
            {
                [writerInput markAsFinished];
                [videoWriter finishWriting];
                [videoWriter release];
                break;
            }
            CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[[array objectAtIndex:frame] CGImage] andSize:size];
            if (buffer)
            {
                if (![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, self.frameRate)])
                    NSLog(@"FAIL");
                else
                    NSLog(@"Success: %d", frame);
                CFRelease(buffer);
            }
        }
    }];
    NSLog(@"outside for loop");
    [self performSelector:@selector(waitTillVideoFinishes) withObject:nil afterDelay:20.0];
}
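The block above calls a pixelBufferFromCGImage:andSize: helper that the question doesn't include. For completeness, here is a common minimal version of that helper, written in the same manual retain/release style as the surrounding code; the actual implementation in the question may differ:
// Sketch of the pixelBufferFromCGImage:andSize: helper referenced above.
- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image andSize:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], (id)kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], (id)kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                                          kCVPixelFormatType_32ARGB,
                                          (CFDictionaryRef)options, &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    // Draw the CGImage into the pixel buffer's backing memory.
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer; // caller is responsible for CFRelease, as done in writeImageAsMovie:
}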
This works fine on the Mac (in the simulator), but on the device it crashes because of memory issues.
I have also tried various ways of watermarking text onto the video, but couldn't get any of them to work. Thanks in advance.
Answer 0 (score: 2)
This code adds a text/string on top of a video; after saving, you can play the resulting video in any player. The big advantage of this code is that it keeps the video's sound as well.
#import <AVFoundation/AVFoundation.h>

- (void)MixVideoWithText
{
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *clipAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    // If you need audio as well, add the asset track for audio here.
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:clipVideoTrack atTime:kCMTimeZero error:nil];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
    [compositionVideoTrack setPreferredTransform:[[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];

    CGSize sizeOfVideo = [videoAsset naturalSize];
    //NSLog(@"sizeOfVideo.width is %f", sizeOfVideo.width);
    //NSLog(@"sizeOfVideo.height is %f", sizeOfVideo.height);

    // The text layer defines the text you want to render over the video.
    CATextLayer *textOfvideo = [[CATextLayer alloc] init];
    textOfvideo.string = [NSString stringWithFormat:@"%@", text]; // text is the string you want to add to the video
    [textOfvideo setFont:(__bridge CFTypeRef)([UIFont fontWithName:[NSString stringWithFormat:@"%@", fontUsed] size:13])]; // fontUsed is the name of the font
    [textOfvideo setFrame:CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height / 6)];
    [textOfvideo setAlignmentMode:kCAAlignmentCenter];
    [textOfvideo setForegroundColor:[selectedColour CGColor]];

    CALayer *optionalLayer = [CALayer layer];
    [optionalLayer addSublayer:textOfvideo];
    optionalLayer.frame = CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height);
    [optionalLayer setMasksToBounds:YES];

    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height);
    videoLayer.frame = CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:optionalLayer];

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 10);
    videoComposition.renderSize = sizeOfVideo;
    videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
    AVAssetTrack *videoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
    videoComposition.instructions = [NSArray arrayWithObject:instruction];

    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
    NSString *destinationPath = [documentsDirectory stringByAppendingFormat:@"/output_%@.mov", [dateFormatter stringFromDate:[NSDate date]]];

    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    exportSession.videoComposition = videoComposition;
    exportSession.outputURL = [NSURL fileURLWithPath:destinationPath];
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;

    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch (exportSession.status)
        {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export OK");
                if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(destinationPath)) {
                    UISaveVideoAtPathToSavedPhotosAlbum(destinationPath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
                }
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"AVAssetExportSessionStatusFailed: %@", exportSession.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export Cancelled");
                break;
        }
    }];
}
This callback reports any error after the video has been saved:
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error)
        NSLog(@"Finished saving video with error: %@", error);
}
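The method above relies on surrounding state (url, text, fontUsed, selectedColour) that isn't shown in the answer. As a hedged illustration only, a caller might set those up like this, assuming they are ivars on the same (ARC) class and that a clip named "sample.mov" ships in the app bundle:
// Illustrative caller — the clip name and the helper method name are hypothetical.
- (void)addCaptionToBundledClip
{
    url = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"mov"];
    text = @"Hello";
    fontUsed = @"Helvetica-Bold";
    selectedColour = [UIColor whiteColor];
    [self MixVideoWithText];
}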
Answer 1 (score: 1)
I know this is an old question, but I just had the same requirement and couldn't find any really quick and simple solution (something I could test quickly) until I found GPUImage issue #110.
As Brad Larson suggests there, the GPUImage framework can record video with existing filters and/or UIKit elements blended into it.
So, based on the information in issue #110, I combined the GPUImage FilterShowcase project with the SimpleVideoFilter project to quickly test whether it works.
Of course it works.
I'm writing this answer to provide a quick example of how to do it:
Open the FilterShowcase project,
open the ShowcaseFilterViewController.m file and find else if (filterType == GPUIMAGE_UIELEMENT)
Right after the code you find there, add this code:
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie.
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
[blendFilter addTarget:movieWriter];

[videoCamera startCameraCapture];

double delayToStartRecording = 0.5;
dispatch_time_t startTime2 = dispatch_time(DISPATCH_TIME_NOW, delayToStartRecording * NSEC_PER_SEC);
dispatch_after(startTime2, dispatch_get_main_queue(), ^(void){
    NSLog(@"Start recording");

    videoCamera.audioEncodingTarget = movieWriter;
    [movieWriter startRecording];

    double delayInSeconds = 5.0;
    dispatch_time_t stopTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
    dispatch_after(stopTime, dispatch_get_main_queue(), ^(void){
        [blendFilter removeTarget:movieWriter];
        videoCamera.audioEncodingTarget = nil;
        [movieWriter finishRecording];
        NSLog(@"Movie completed");

        // Save the recorded clip to the photo library.
        UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, NULL, NULL);
    });
});

[videoCamera startCameraCapture];
return;
After that, compile the FilterShowcase project on an iPhone and select UI element from the list. After about 7-10 seconds, a video will be saved in your photo library.
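For context: blendFilter, videoCamera and the on-screen GPUImageView are not created by the inserted snippet — they already exist in the GPUIMAGE_UIELEMENT branch of ShowcaseFilterViewController.m. From memory, the wiring in that branch looks roughly like the sketch below; the exact filter, label, and variable names in FilterShowcase may differ, so treat this only as an approximation of the pipeline:
// Approximate sketch of a GPUImage pipeline that blends a UIKit view over camera frames.
// videoCamera is the GPUImageVideoCamera the showcase controller already owns.
GPUImageView *filterView = (GPUImageView *)self.view; // the showcase controller's view is a GPUImageView

GPUImageSepiaFilter *sceneFilter = [[GPUImageSepiaFilter alloc] init];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;

UILabel *overlayLabel = [[UILabel alloc] initWithFrame:CGRectMake(0.0, 0.0, 240.0, 320.0)];
overlayLabel.text = @"Hello";
overlayLabel.textColor = [UIColor whiteColor];
overlayLabel.backgroundColor = [UIColor clearColor];

GPUImageUIElement *uiElementInput = [[GPUImageUIElement alloc] initWithView:overlayLabel];

[videoCamera addTarget:sceneFilter];
[sceneFilter addTarget:blendFilter];     // camera frames into the blend's first input
[uiElementInput addTarget:blendFilter];  // rendered UIKit view into the blend's second input
[blendFilter addTarget:filterView];      // live preview on screen

// The UI element has to be re-rendered for every camera frame.
__unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
[sceneFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime frameTime){
    [weakUIElementInput update];
}];
Because the movieWriter from the answer's snippet is attached to blendFilter, whatever this pipeline shows on screen (camera video plus the UIKit overlay) is exactly what ends up in the recorded file.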