Problem creating a video file from a UIImage and a caf audio file

Date: 2011-08-04 09:07:53

Tags: iphone video audio uiimage

I have read every post I could find on the internet about this, and I have had some success creating the video file, but I still have 3 problems that no one seems to have mentioned.

I have 3 problems:

  1. The video does not play correctly in some players: in QuickTime Player (on Windows) only one frame plays and then the screen turns white, and the video will not play on YouTube at all.

  2. Some images, for some reason, come out badly distorted.

    http://lh3.googleusercontent.com/-Jyz-L1k3MEk/TjpfSfKf8LI/AAAAAAAADBs/D1GYuEqI-Oo/h301/1.JPG (well, they say I am a new user and won't let me post images in the post.)

  3. For some reason the orientation of some images is wrong, and even though I change the drawing context according to the orientation, it still has no effect.

Can anyone help me with this? Thank you very much in advance!!

    Here is my code:

    1. I use this function to create the video from a UIImage. I only use one image and one audio file (caf), and I want the image to be displayed while the audio plays.

    - (void)writeImageAndAudioAsMovie:(UIImage*)image andAudio:(NSString *)audioFilePath duration:(int)duration {
        NSLog(@"start make movie: length:%d",duration);
        NSError *error = nil;
        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:ImageVideoPath] fileType:AVFileTypeMPEG4
                                                              error:&error];
        NSParameterAssert(videoWriter);
        if ([[NSFileManager defaultManager] fileExistsAtPath:ImageVideoPath]) 
            [[NSFileManager defaultManager] removeItemAtPath:ImageVideoPath error:nil];
    
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:image.size.width],AVVideoWidthKey,[NSNumber numberWithInt:image.size.height], AVVideoHeightKey,nil];
        AVAssetWriterInput* writerInput = [[AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeVideo
                                        outputSettings:videoSettings] retain];
    
        AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:nil];
        NSParameterAssert(writerInput);
        NSParameterAssert([videoWriter canAddInput:writerInput]);
        writerInput.expectsMediaDataInRealTime = YES;
        [videoWriter setShouldOptimizeForNetworkUse:YES];
        [videoWriter addInput:writerInput];
    
        //Start a session:
        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:kCMTimeZero];
    
        //Write samples:
        CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage];
        [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
    
        //Finish the session:
        [videoWriter endSessionAtSourceTime:CMTimeMake(duration, 1)];
        [writerInput markAsFinished];
        [videoWriter finishWriting];
    
        CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
        [videoWriter release];
        [writerInput release];
        [self addAudioToFileAtPath:ImageVideoPath andAudioPath:audioFilePath];
    }
    

    2. Create a CVPixelBufferRef for the video

    -(CVPixelBufferRef)pixelBufferFromCGImage: (CGImageRef) image{
        float width = CGImageGetWidth(image);
        float height = CGImageGetHeight(image);
    
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
        CVPixelBufferRef pxbuffer = NULL;
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width,height, kCVPixelFormatType_32ARGB,(CFDictionaryRef)options,&pxbuffer);
    
        NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    
        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    
        NSParameterAssert(pxdata != NULL);
    
        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(pxdata,width,height,8,4*width,rgbColorSpace,kCGImageAlphaNoneSkipFirst);
    
        NSParameterAssert(context);
        CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);
    
        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    
        return pxbuffer;
    }
    

    3. Put the video and the audio together

    -(void) addAudioToFileAtPath:(NSString *)vidoPath andAudioPath:(NSString *)audioPath{
        AVMutableComposition* mixComposition = [AVMutableComposition composition];
    
        NSURL* audio_inputFileUrl = [NSURL fileURLWithPath:audioPath];
        NSURL* video_inputFileUrl = [NSURL fileURLWithPath:vidoPath];
    
        NSString *outputFilePath = FinalVideoPath;
        NSURL* outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    
        if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath]) 
            [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
    
        AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil];
        CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
        AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    
    
        AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil];
        CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration);
        AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    
        //nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);
        [audioAsset release];audioAsset = nil;
        [videoAsset release];videoAsset = nil;
    
        AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];   
        _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
        _assetExport.outputURL = outputFileUrl;
    
        [_assetExport exportAsynchronouslyWithCompletionHandler:
         ^(void ) {
             switch (_assetExport.status) 
             {
                 case AVAssetExportSessionStatusCompleted:
                     //export complete 
                     NSLog(@"Export Complete");
                     break;
                 case AVAssetExportSessionStatusFailed:
                     NSLog(@"Export Failed");
                     NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                 //export error (see exportSession.error)  
                     break;
                 case AVAssetExportSessionStatusCancelled:
                     NSLog(@"Export Failed");
                     NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                     //export cancelled  
                     break;
             }
          }];    
    }
    

1 Answer:

Answer 0 (score: 2):

I had this problem. If you want to fix it, here is your checklist:

1) The video cannot have an alpha channel, so your pixelBufferFromCGImage should look like this:

static OSType pixelFormatType = kCVPixelFormatType_32ARGB;


- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image {
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             @YES, kCVPixelBufferCGImageCompatibilityKey,
                             @YES, kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          frameSize.width,
                                          frameSize.height,
                                          pixelFormatType,
                                          (__bridge CFDictionaryRef)options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGImageAlphaNoneSkipFirst & kCGBitmapAlphaInfoMask;

    //NSUInteger bytesPerRow = 4 * frameSize.width;
    NSUInteger bitsPerComponent = 8;
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pxbuffer);

    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 frameSize.width,
                                                 frameSize.height,
                                                 bitsPerComponent,
                                                 bytesPerRow,
                                                 rgbColorSpace,
                                                 bitmapInfo);

    CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
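
As a usage sketch (following the writer and adaptor setup from the question, so adaptor, image, and the zero presentation time are assumptions taken from that code), note that the buffer returned here carries a +1 reference from CVPixelBufferCreate and can be released once it has been appended:

// Hypothetical call site; appendPixelBuffer: retains the buffer, so the +1
// reference returned by CVPixelBufferCreate can be balanced right after appending.
CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage];
if (buffer != NULL) {
    [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
    CVPixelBufferRelease(buffer);
}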

2) Make sure you are testing on a real device. The simulator tends to distort the video; I had exactly the same problem when I made videos in the simulator.

3) Make sure you create the AVAssetWriterInputPixelBufferAdaptor like this:

NSMutableDictionary *attributes = [[NSMutableDictionary alloc] init];
[attributes setObject:[NSNumber numberWithUnsignedInt:pixelFormatType] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:imageSize.width] forKey:(NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:imageSize.height] forKey:(NSString*)kCVPixelBufferHeightKey];

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                          assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                          sourcePixelBufferAttributes:attributes];
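
The point of passing sourcePixelBufferAttributes here (instead of nil, as in the question) is that the adaptor's pixelBufferPool will then vend buffers whose pixel format and dimensions already match the frames being appended, which is presumably why this step is on the checklist for the distorted output.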

I had a couple of other problems as well, but no distorted video or orientation problems. You will need to rotate the image to the correct orientation yourself if you do not request the thumbnail image directly from the asset; one way to do that is sketched below.
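
For the orientation issue (problem 3), a minimal sketch of one common approach, not taken from the answer itself, is to redraw the UIImage so its backing pixels are already in the up orientation before handing it to pixelBufferFromCGImage: (the helper name below is hypothetical):

// Hypothetical helper: re-render the image so its pixel data matches UIImageOrientationUp.
// drawInRect: applies the image's orientation transform while drawing.
- (UIImage *)normalizedImageFromImage:(UIImage *)image {
    if (image.imageOrientation == UIImageOrientationUp)
        return image;
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}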