Merging audio and video of different lengths

Date: 2012-02-07 16:18:48

Tags: iphone avfoundation

I'm making a video by merging a movie file created from a single image (just one frame) with a few seconds of audio, following this tutorial.

On iPhone devices the video duration equals the audio duration, and I see the image for the whole video.

But when I share it to an Android device (via WhatsApp) and press play, the playback duration is that of the movie made from the image (one frame). As a test, I created a movie file repeating the image a hundred times (10 fps, ten seconds), and on the Android device the playback duration was ten seconds.

I think Android devices play only the shortest track in the video, but if I change the time range of the video track added with addMutableTrackWithMediaType to the audio duration, nothing changes.

Any suggestions?

Thanks for your support.

Here is all the code:

-(void) writeImagesToMovieAtPath:(NSString *)path withSize:(CGSize) size {

    NSMutableArray *m_PictArray = [NSMutableArray arrayWithCapacity:1];
    [m_PictArray addObject:[UIImage imageNamed:@"prueba.jpg"]];

    NSString *documentsDirectoryPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSArray *dirContents = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:documentsDirectoryPath error:nil];
    for (NSString *tString in dirContents) {
        if ([tString isEqualToString:@"essai.mp4"]) 
        {
            [[NSFileManager defaultManager] removeItemAtPath:[NSString stringWithFormat:@"%@/%@",documentsDirectoryPath,tString] error:nil];
        }
    }

    NSLog(@"Write Started");

    NSError *error = nil;

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeMPEG4
                                                              error:&error];    
    NSParameterAssert(videoWriter);

    NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:128000], AVVideoAverageBitRateKey,
                                   [NSNumber numberWithInt:15],AVVideoMaxKeyFrameIntervalKey,
                                   AVVideoProfileLevelH264Main30, AVVideoProfileLevelKey,
                                   nil];    

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   codecSettings,AVVideoCompressionPropertiesKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];    

    AVAssetWriterInput* videoWriterInput = [[AVAssetWriterInput
                                             assetWriterInputWithMediaType:AVMediaTypeVideo
                                             outputSettings:videoSettings] retain];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];

    NSParameterAssert(videoWriterInput);

    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    videoWriterInput.expectsMediaDataInRealTime = YES; // note: NO is more typical when writing offline rather than capturing live
    [videoWriter addInput:videoWriterInput];
    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];


    //Video encoding

    CVPixelBufferRef buffer = NULL;

    //convert uiimage to CGImage.

    int frameCount = 0;

    for(int i = 0; i<[m_PictArray count]; i++)
    {
        buffer = [self newPixelBufferFromCGImage:[[m_PictArray objectAtIndex:i] CGImage] andSize:size];

        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30) 
        {
            if (adaptor.assetWriterInput.readyForMoreMediaData) 
            {
                printf("appending %d attemp %d\n", frameCount, j);

                CMTime frameTime = CMTimeMake(frameCount,(int32_t) 10);
                /*
                Float64 seconds = 1; 
                int32_t preferredTimeScale = 10;
                CMTime frameTime = CMTimeMakeWithSeconds(seconds, preferredTimeScale);
                */
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
                NSParameterAssert(bufferPool != NULL);

                [NSThread sleepForTimeInterval:0.05];
            } 
            else 
            {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d\n", frameCount, j);
        }
        frameCount++;
        CVBufferRelease(buffer);
    }

    [videoWriterInput markAsFinished];
    [videoWriter finishWriting];

    [videoWriterInput release];
    [videoWriter release];

    [m_PictArray removeAllObjects];

    NSLog(@"Write Ended");

    [self saveVideoToAlbum:path]; 
}


-(void)CompileFilesToMakeMovie {

    NSLog(@"CompileFilesToMakeMovie");

    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    NSString *documentsDirectoryPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];    

    //Audio file in AAC
    NSString* audio_inputFileName = @"zApY4o8QY.m4a";

    NSString* audio_inputFilePath = [NSString stringWithFormat:@"%@/%@",[[NSBundle mainBundle] resourcePath],audio_inputFileName];
    NSURL*    audio_inputFileUrl = [NSURL fileURLWithPath:audio_inputFilePath];

    NSString* video_inputFileName = @"essai.mp4";
    NSString* video_inputFilePath = [NSString stringWithFormat:@"%@/%@",documentsDirectoryPath,video_inputFileName];
    NSURL*    video_inputFileUrl = [NSURL fileURLWithPath:video_inputFilePath];

    NSString* outputFileName = @"outputFile.mov";
    NSString* outputFilePath = [NSString stringWithFormat:@"%@/%@",documentsDirectoryPath,outputFileName];

    NSURL*    outputFileUrl = [NSURL fileURLWithPath:outputFilePath];

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath]) 
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];


    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil];
    AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil];


    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];



    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetLowQuality];   
    _assetExport.shouldOptimizeForNetworkUse = YES;
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie; // same UTI as @"com.apple.quicktime-movie"
    _assetExport.outputURL = outputFileUrl;
    _assetExport.timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);

    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
         // Save only if the export actually succeeded.
         if (_assetExport.status == AVAssetExportSessionStatusCompleted) {
             [self saveVideoToAlbum:outputFilePath];
         }
     }
     ];

    NSLog(@"CompileFilesToMakeMovie Finish");
}

- (void) saveVideoToAlbum:(NSString*)path {

    NSLog(@"saveVideoToAlbum");

    if(UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)){
        UISaveVideoAtPathToSavedPhotosAlbum(path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
    }
}

-(void) video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if(error)
        NSLog(@"Exported with error: %@", error);
    else 
        NSLog(@"Exported OK");
} 

- (CVPixelBufferRef) newPixelBufferFromCGImage: (CGImageRef)image andSize:(CGSize)frameSize {

    // Unused: this all-zero transform would collapse the image if applied
    // (the CGContextConcatCTM call below is commented out).
    CGAffineTransform frameTransform = CGAffineTransformMake(0, 0, 0, 0, 0, 0);

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, 
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width,
                                                 frameSize.height, 8, 4*frameSize.width, rgbColorSpace, 
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);
    //CGContextConcatCTM(context, frameTransform);
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), 
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return (CVPixelBufferRef)pxbuffer;
}

2 Answers:

Answer 0 (score: 0)

Just fixed it!

I create the movie file with the image repeated X times, and then during composition I scale it up to audioAsset.duration:

CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration);
AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
[a_compositionVideoTrack scaleTimeRange:video_timeRange toDuration:audioAsset.duration];

You need the image repeated more than once for the track scaling to work. But if the movie has only 2 frames, Android plays only 8 seconds, so I made a video with the image repeated 10 times, which also let me get past the 45-second limit for video sharing on WhatsApp.
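
For reference, a minimal sketch of how the frame-writing loop could look when repeating the single image (this assumes the same adaptor, m_PictArray and size from writeImagesToMovieAtPath above; repeatCount is a hypothetical name, not from the original code):

// Minimal sketch: append the same pixel buffer repeatCount times at 10 fps,
// reusing the adaptor and helpers from writeImagesToMovieAtPath above.
int repeatCount = 10; // hypothetical; enough frames for the track to scale
CVPixelBufferRef buffer = [self newPixelBufferFromCGImage:[[m_PictArray objectAtIndex:0] CGImage] andSize:size];
for (int frameCount = 0; frameCount < repeatCount; frameCount++) {
    // Wait until the writer input can accept another frame.
    while (!adaptor.assetWriterInput.readyForMoreMediaData) {
        [NSThread sleepForTimeInterval:0.1];
    }
    CMTime frameTime = CMTimeMake(frameCount, 10); // frame N is shown at N/10 s
    [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
}
CVBufferRelease(buffer);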

Answer 1 (score: 0)

In the CompileFilesToMakeMovie method, use video_timeRange instead of audio_timeRange where needed...
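
If I read that right, the idea is to drive the export window from the video's time range rather than the audio's, e.g. (a sketch under that assumption):

// Sketch: export over the video's time range instead of the audio's.
_assetExport.timeRange = video_timeRange; // i.e. CMTimeRangeMake(kCMTimeZero, videoAsset.duration)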